Sonic the Hedgehog 2020: we reveal the soundtrack, trailer, cast and release date – Classic FM

30 January 2020, 13:31 | Updated: 30 January 2020, 13:33

The speedy blue hedgehog first found fame as the protagonist of Sega's popular video game series, and now there's a new movie inspired by his adventures. Here's everything you need to know about the live-action adaptation and its soundtrack.

The adventure comedy Sonic the Hedgehog tells the story of a blue anthropomorphic hedgehog who comes to Earth from another dimension in an attempt to escape corrupt forces.

Based on Sega's 1991 video game franchise of the same name, the live-action adaptation stays true to Sonic's character and design, giving him the ability to run at supersonic speeds and curl into a ball as a means of attacking his enemies.

After Sonic accidentally causes a mass power outage on Earth, the government hires evil scientist and inventor, Dr. Robotnik, to hunt him down and use his powers for world domination.

Keen to evade capture, Sonic joins forces with Tom Wachowski, a friendly human sheriff who quickly becomes his best friend and does all he can to help the hedgehog.

The directorial debut of American filmmaker Jeff Fowler, Sonic the Hedgehog features a stellar cast, including actor, writer and comedian Jim Carrey, who plays the evil Dr. Robotnik.

Carrey is well known for his wacky film roles, including his appearances in the 1994 comedy classics The Mask and Ace Ventura: Pet Detective.

Meanwhile, American actor, singer and former model James Marsden has been cast as Tom Wachowski, while actor Ben Schwartz (known for House of Lies and for portraying Jean-Ralphio Saperstein on the NBC show Parks and Recreation) provides the voice and facial motion capture for the lead character, Sonic.

When Sonic first hit our screens as part of Sega's video game series, Masato Nakamura (bassist and songwriter of the J-pop band Dreams Come True) was commissioned to write the soundtrack.

For the upcoming movie, the Dutch DJ, composer, producer and multi-instrumentalist Tom Holkenborg has penned the score.

Also known as Junkie XL, or JXL, the 52-year-old composer first became known for his trance productions, and has made remixes for the likes of Scissor Sisters, Depeche Mode and Fatboy Slim.

Since then, he has moved into the world of film and worked with the legendary Hans Zimmer on Man of Steel and Batman v Superman: Dawn of Justice, as well as penning the scores for Deadpool, Mad Max: Fury Road, and Tomb Raider.

The official soundtrack for Sonic the Hedgehog is still to be released, but if it's anything like the music in the trailer, we're expecting plenty of video game nostalgia.

The official release date for Sonic the Hedgehog is 25 January 2020 in the UK and 14 February 2020 in the US.

The power of language in The Topeka School – The Trinitonian

Illustration by Andrea Nebhut

When Ben Lerner was my age, he was a Fulbright Scholar, he was in the running to be a MacArthur fellow, and he would go on to enter many other clubs that end with the illustrious title of fellow. That preface is to make clear that my review of his book The Topeka School doesn't come from the standpoint of a seasoned literary reviewer but from a growing admirer, one who stumbled upon the book because it was shortlisted for another sort of award. That is, it was on Barack Obama's list of favorite books he read in 2019.

Set in Ben Lerner's hometown of Topeka, Kansas, the book follows members of the Gordon family (Adam, Jane and Jonathan), with each chapter dedicated to a specific character. None of the family's stories throughout the novel follows a linear chronology, but each starts in 1997.

Adam is a debate star with a quintessential high school romance weighing on his mind, Jane enters a prosperous time in her writing career, and Jonathan is in a period of malaise in his psychiatric practice. The pace and direction of the novel are set by the book's other main character, Darren. His chapters are brief, styled in italics and intended to capture the mind of a developmentally disabled individual.

From there, Lerner jumps the reader through varying points in each of his characters' chaotic lives. At times, these jumps are disorienting and the perspectives of his characters are unclear; the only mooring is the beginning of a new chapter.

In between the sections that focus on the Gordon family is the timeline of Darren, a man trapped in adolescence, an adolescence that Lerner uses to symbolize America. Lerner reveals how the characters of his novel each display their own degrees of adolescence, each a part of the regressive engine that has become the American zeitgeist.

Much of The Topeka School is eerily beautiful, from its descriptions of Adam's addled mind to its haunting retelling of the traumas of Jane's life. Yet what I believe to be the most powerful aspect of the novel is Lerner's playful and deliberate use of language throughout each of the characters' stories. Lerner discusses the various techniques Adam uses to harness the dominating force that is language, juxtaposing that with Jane's use of language as a conduit for the pains and struggles of being a mother.

In contrast to both of them, Lerner uses Jonathan as a vessel to discuss the breaking down of language: the complete forgoing of conventional speech patterns, with the speaker entering a trance-like state where they have no idea what the words they speak are or how they know them, yet they understand them.

This deconstruction of language is mirrored in the novel by its subtle documenting of the slowly growing divide in America. Lerner alludes to how middle America grew isolated and hateful while the coasts grew elite and arrogant, and how a fundamental breakdown of language as a means of communication created what Lerner calls "the spread." The spread is a debate tactic in which the speaker overwhelms their opponent with a gargantuan number of arguments, leaving them no time to cover all of them; the arguments that go unanswered are then scored as victories for the user of the spread.

For Lerner, the spread is how America uses language not as a tool of communication and understanding, but as one of dominance and power, where one doesn't intend to listen to the other side but just wants to silence them with one's own thoughts.

His novel doesn't end with a solution to the problem. In many ways his novel is more a report to the academy that is America right now, meant to stifle the current discourse and wake us up from the daze we are in; to stir America from the adolescent slumber it has been in for far too long.

Stryker (SYK) Q4 Earnings and Revenues Surpass Estimates – Yahoo Finance

Stryker Corporation SYK reported fourth-quarter 2019 adjusted earnings per share (EPS) of $2.49, which beat the Zacks Consensus Estimate by 1.2%. Further, the bottom line improved 14.2% year over year and exceeded the high end of management's guidance range.

The Michigan-based medical device company reported revenues of $4.13 billion, which outpaced the Zacks Consensus Estimate by 0.7%. Revenues improved 8.8% on a year-over-year basis and 9.4% at constant currency (cc).
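
For readers who want to check the arithmetic, the consensus figures implied by these beats can be backed out directly. A minimal sketch, assuming the beat is computed as (actual - estimate) / estimate:

```python
# Back out the implied Zacks consensus figures from the reported beats,
# assuming beat = (actual - estimate) / estimate.
actual_eps, eps_beat = 2.49, 0.012      # Q4 adjusted EPS, 1.2% beat
actual_rev, rev_beat = 4.13e9, 0.007    # Q4 revenues, 0.7% beat

implied_eps_estimate = actual_eps / (1 + eps_beat)   # ~$2.46
implied_rev_estimate = actual_rev / (1 + rev_beat)   # ~$4.10 billion

print(f"Implied consensus EPS: ${implied_eps_estimate:.2f}")
print(f"Implied consensus revenue: ${implied_rev_estimate / 1e9:.2f}B")
```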

2019 at a Glance

In 2019, the company reported revenues of $14.88 billion, which beat the Zacks Consensus Estimate. On a year-over-year basis, the top line improved 9.4%.

Adjusted EPS for the year was $8.26, which beat the Zacks Consensus Estimate by 0.4%. The bottom line also increased 13% year over year.

[Chart: Stryker Corporation price, consensus and EPS surprise]

Revenues by Geography

Revenues in the United States came in at $3.04 billion, up 9.8% year over year. International sales were up 6.2% to $1.09 billion.

U.S. organic sales improved 8.2% and international organic sales came in at 7.6%. While solid performance across Orthopaedics, MedSurg and Neurotechnology segments drove growth in the United States, robust gains in emerging markets, Europe, Japan and Canada led to higher international organic sales.

Segmental Analysis

Orthopaedics: In the quarter under review, revenues in the segment totaled $1.47 billion, up 6.7% year over year. The segment's revenues improved 7.3% at cc. The performance can be attributed to better results at the Knees, Hips and Other sub-segments. The company continues to witness solid demand for the Mako TKA (Total Knee Arthroplasty) platform, cementless knees and other 3D-printed products.

MedSurg: This segment reported sales of $1.84 billion, up 6.8% year over year. Sales at the segment increased 7.4% at cc. Per management, the segment improved 6.8% organically in the reported quarter, led by strong Endoscopy, Instruments and Medical performances.

Neurotechnology & Spine: Sales in the segment amounted to $827 million, up 18% year over year and 18.2% at cc. Organically, the segment witnessed growth of 12.5%. Per management, the upside was driven by solid performance by the neurotech product lines.

Margins

In the fourth quarter, gross profit totaled $2.70 billion, up 10.1% from the year-ago quarter. Adjusted gross margin was 66.3%, up 60 bps.

Operating income totaled $944 million, up 35.2% from the prior-year quarter. Adjusted operating margin was 28.3%, up 80 bps.

Financial Update

Cash and cash equivalents came in at $4.34 billion, up 19.9% from the year-ago quarter.

Cash flow from operating activities as of Dec 31, 2019, came in at $2.19 billion, down 16.1% from the year-ago period.

2020 Outlook

Stryker expects 2020 organic net sales growth to be in the range of 6.5-7.5%.

On a full-year basis, adjusted EPS is expected in the band of $9.00 to $9.20. The Zacks Consensus Estimate is pegged at $9.03, within the company's guided range.

For first-quarter 2020, adjusted EPS is anticipated between $2.05 and $2.10. The Zacks Consensus Estimate stands at $2.05, within the company's projected range.

Wrapping Up

Stryker exited fourth-quarter 2019 on a strong note, with both earnings and revenues beating their respective consensus marks. The company continues to gain from its core MedSurg unit, which put up a strong show in the reported quarter. Additionally, strength in its flagship Mako platform continues to favor the company. Moreover, solid performance by the neurotech product lines drove the core Neurotechnology & Spine unit in the quarter under review. Solid international growth also buoys optimism. Expansion in operating margin is a positive, while a strong outlook for 2020 is indicative of bright prospects.

However, pricing pressure continues to plague Stryker. Stiff competition in the MedTech space also remains a concern.

Zacks Rank

Stryker currently carries a Zacks Rank #3 (Hold).

Key Picks

Some better-ranked stocks in the broader medical space are SeaSpine Holdings Corporation SPNE, STERIS plc STE and DexCom, Inc. DXCM, all three carrying a Zacks Rank #2 (Buy). You can see the complete list of today's Zacks #1 (Strong Buy) Rank stocks here.

The Zacks Consensus Estimate for SeaSpine's fourth-quarter 2019 revenues is pegged at $43.6 million, suggesting growth of 14.7% from the prior-year reported figure. The same for loss per share is anticipated at 44 cents, indicating an improvement of 16.9% from the year-ago reported figure.

The Zacks Consensus Estimate for STERIS's third-quarter fiscal 2020 revenues is pegged at $749.7 million, indicating an improvement of 7.7% from the year-earlier reported figure. The same for adjusted earnings per share stands at $1.43, indicating growth of 13.5% from the year-ago reported figure.

The Zacks Consensus Estimate for DexCom's fourth-quarter 2019 revenues is pegged at $457 million, suggesting growth of 35.2% from the prior-year reported figure. The same for adjusted earnings per share stands at 72 cents, indicating an improvement of 33.3% from the prior-year reported figure.

Stryker Announces Q4/Full Year 2019 Financial Results and 2020 Outlook – Financialbuzz.com

Stryker is one of the world's leading medical technology companies and, together with its customers, is driven to make healthcare better. The company offers innovative products and services in Orthopaedics, Medical and Surgical, and Neurotechnology and Spine that help improve patient and hospital outcomes.

Stryker (NYSE: SYK) announced fourth-quarter and full-year 2019 financial results and 2020 guidance. For the fourth quarter, net sales reached USD 4.1 billion, an 8.8% increase, as organic net sales increased by 8%. For the full year 2019, net sales amounted to USD 14.9 billion, a 9.4% increase.

"We had an excellent finish to 2019, achieving 8.1% full-year organic sales growth and 13% adjusted EPS gains. This marks our seventh consecutive year of accelerating organic sales growth and is a testament to our talent, culture and durable operating model," said Kevin Lobo, Chairman and Chief Executive Officer. "The performance was balanced across businesses and geographies and positions us well for continued success."

Net earnings for the fourth quarter and full year reached USD 725 million and USD 2.1 billion, falling 64.9% and 41.4%, respectively. Net sales growth in 2020 is expected to range from 6.5% to 7.5%, while adjusted net earnings per diluted share are expected to range from USD 9.00 to USD 9.20. For the first quarter of 2020, adjusted net earnings per diluted share are anticipated to range from USD 2.05 to USD 2.10.

CES 2020: The symbiosis of human and machine – The Drum

Humans are the reproductive organs of technology.

- Kevin Kelly, What Technology Wants

...and if CES 2020 was anything to go by, humans across the globe have been hard at it, giving birth to some extraordinary tech in the Nevada desert.

Technology is a weird thing to celebrate. Actually, to celebrate technology is ultimately to celebrate humans and human ability. CES 2020 was the ultimate expression of such a celebration, a holographic window into the magic we can build when we come together.

Of course, collaborations have always been the fuel of technological progress, and there have been incredible examples of this in previous years at CES, but this year the onus seemed to be on perhaps the greatest collaboration: the symbiosis of human and machine.

Such symbiosis enables human thoughts to play Duck Hunt and unpick visual number codes onscreen using a mind-blowing brain-computer interface from NextMind, where the future of neurotechnology is seemingly happening now.

Similarly, Virtual Touch means that touch is no longer required for interactive digital screens when using next-gen gesture control from V-Touch. A further example is next-level AI-powered avatars in the form of Artificial Humans from the Samsung STAR Labs Neon project, which were both impressive and slightly unnerving.

STAR Labs' CEO Pranav Mistry said Neons will "integrate with our world and serve as new links to a better future, a world where 'humans are humans' and 'machines are humane'," and suddenly the world felt a little like Black Mirror.

But perhaps the most exciting aspect of this expo was the clear agenda for taking this symbiosis, making it aesthetically pleasing and using it for good.

(It was a shame, however, that the event itself didn't have a similar ethos: I saw no recycling units, lots of single-use plastic, no alternative power charge-up sources, and so on. With Elon Musk's announcement, perhaps a few low-tech-but-highly-usable recycling bins can also be rolled out?)

The purpose behind tech

There were too many examples of purpose-driven tech to mention every one, but some of the highlights were: the Vision AVTR, a beautiful visualisation of the Mercedes-Benz concept-car dream of 2039 (launched by James Cameron himself at the start of the week), built by MB and the design team behind the Avatar movie. They've created a vehicle inspired by the relationship between humans, technology and the natural world, mirroring the same premise as the Pandora/Na'vi concept; the car responds only to the driver's heartbeat and breathing pattern, has recycled interiors, organic battery technology and a projection to replace the dashboard, and was essentially utterly magical.

Toyota launched plans for its hydrogen-powered Woven City: a living lab for understanding a true smart city and the requisite infrastructure and technologies needed to build a harmonious and sustainable human habitat. And this isn't just an abstract dream: ground will be broken on the site (at the foot of Mount Fuji) in 2021, and applications for residency are open!

Honda and its innovation incubator (Xcelerator) showcased the Skelex ergo-skeleton, a mechanised wearable skeleton that provides support and structure to limbs (and gives an additional 4kg of power per arm, essentially giving you superhero powers).

Samsung, LG and General Electric all showcased impressive examples of indoor LED farming solutions, setting out a firm vision of reducing food imports, and thus carbon emissions, in a future where we can grow our own produce in the home using aeroponics, hydroponics and soil.

And our world of OOH technology has an opportunity and an obligation to follow suit and be better for everyone. We need to imagine more ways, better ways to incorporate technologies into our public space infrastructure by collaborating with different disciplines to ensure more sustainable, more creative, and more useful outdoor advertising.

AR suggests multiple uses in this space; radar, lidar and sonar have a potential role to play; visual technologies such as volumetrics, eye-sensing light-field displays and holographics can operate with low-cost energy solutions; kinetic energy conversion and pollution-eating algae should be commonplace; living formats are under-utilised; and so on. The opportunities are as endless as the possibilities.

But we can only get there with meaningful tech collaborations. And lots of big dreaming.

So, thanks CES, for the inspiration. And the jetlag. And the blisters. See you next time.

Bitcoin.com withdraws from Bitcoin Cash proposal to divert part of block rewards to dev fund – Yahoo Finance

Roger Ver's mining pool Bitcoin.com has decided to not support a proposal that aims to redirect 12.5% of Bitcoin Cash block rewards to a development fund.

In a blog post on Tuesday, Bitcoin.com said it will "not go through with supporting any plan unless there is more agreement in the ecosystem such that the risk of a chain split is negligible."

"Bitcoin.com will not risk a chain split or a change to the underlying economics. In order to do this, any proposal will need to have as many people of economic weight on-board as possible, including businesses, exchanges, miners, and Bitcoin Cash implementations," the blog post stated.

Last week, the CEO of mining pool BTC.TOP, Jiang Zhuoer, announced the block reward petition in a blog post, stating that the move was meant to support the development of Bitcoin Cash infrastructure, and threatening to orphan blocks from miners that do not go along with the proposal.

At the time, Bitcoin.com, Antpool, BTC.com, and ViaBTC had all signed Jiang's petition, together representing around 31.6% of the total Bitcoin Cash hash rate. Bitcoin.com itself accounts for roughly 0.39% of the total Bitcoin Cash hash rate.

Transformation, Domination, Evasion, and 20 Crypto Jokes – Cryptonews

This week in crypto, experts at the WEF said that digital finance, crypto and blockchain-powered economics are sweeping through markets in the developing world unchallenged, and that digital tokens could transform the world of business and art. While Binance aims to find more merchants for its P2P crypto trading platform, a resurfaced change in the Binance whitepaper sparked transparency questions. We learned that Bitcoin is a step closer to Taproot; Bitcoin ETFs are still far away; CME doubled its Bitcoin options volume, dominating over Bakkt; Ripple discussed an IPO again and reduced sales of XRP from its escrow fund; developers believe that Ethereum's real-world blockchain applications could come too late; BTC may have already entered its fourth bullish price cycle; and Block.one won't be launching Voice on the EOS blockchain. The crypto derivatives market may be double the size of the spot market in 2020, and 1.5 billion people might have CBDC in their wallets in 3 years.

Uzbekistan may see tax authorities waive taxes on crypto trading, Venezuela is in talks with Cuba about using the Petro, the South Korean government is reviewing a proposal that could see it introduce a tax on cryptos, and the new Russian Prime Minister seems to want the government to adopt a crypto tax law by spring, but South Korean bankers are also concerned about a Starbucks crypto threat. The Lithuanian central bank says its limited-edition digital LBcoin is ready for launch this spring, and Britain's highest tax authority is cracking down on crypto tax evasion. The Cryptoverse was stunned as a mining pool group signed a threat to Bitcoin Cash miners, Ethereum officially started the One Million Developers challenge, and Bitcoin critic Peter Schiff started another 'Proof of Keys' day, with the mystery of his lost BTC solved soon after. Meanwhile, a Mexican firm offered crypto for the presidential plane, while Jihan Wu and Roger Ver stunned the Cryptoverse.

And now a carefully selected collection of jokes.

Captions of the 20 memes embedded in the original post:

1. Jus' sayin'.
2. Going in the opposite direction now.
3. Because memes are both funny and simple. Here's an example.
4. This week demands an obligatory Schiff meme.
5. No. Two, at the very least.
6. Crypto hodlers after reading Schiff's tweets.
7. Is she trying to relate or out-tragic him?
8. You know it's true.
9. Yoouuu son of a bitcoin!
10. Veterans smelling an incoming scam shitcoin attack from a mile away.
11. And it flows into the ocean of my tears.
12. Living dangerously - short version.
13. Living dangerously - long version.
14. When you just want to explain.
15. Preparation 101.
16. Phase 1 of Bitcoin owning. Proceed to phase 2.
17. Phase 2 of Bitcoin owning. Return to phase 1.
18. Tough position.
19. Perfect and scientific.
20. "That is ridiculous!"

Competing Bitcoin Cash Mining Pool Averted ‘For the Time Being’ – Cryptonews

Source: iStock/Vitalij Sova

A pool of anonymous Bitcoin Cash (BCH) miners has decided not to start a competing pool for the time being and will continue to support the existing BCH pools, after supporters of the controversial proposal clarified their ideas.

The group believes that Bitcoin.com, an operator of a BCH mining pool, will convince other signatories to severely amend the proposal.

"We would also like to thank the community to be able to have such a civilized discussion over this issue," the European and North American miners added.

At pixel time (12:14 PM UTC), BCH trades at c. USD 368 and is down 0.5% in a day and up 7% in a week.

As reported, BTC.TOP's CEO Jiang Zhuoer offered to direct 12.5% of mining rewards to support the BCH infrastructure over a six-month period. Zhuoer's post included a threat to orphan noncomplying BCH blocks (those blocks would no longer be included in the BCH blockchain, leaving the miner without the reward).
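
To put the proposal's parameters in rough perspective, here is a back-of-the-envelope sketch. The block reward, blocks-per-day figure, window length and BCH price are illustrative assumptions (the 12.5 BCH reward and ~144 blocks per day reflect pre-halving norms; the price is the one quoted above), not figures from the proposal itself:

```python
# Rough scale of the proposed BCH dev fund, under stated assumptions.
block_reward_bch = 12.5    # BCH per block (pre-halving, assumed)
blocks_per_day = 144       # ~one block every 10 minutes
fraction = 0.125           # 12.5% of block rewards
days = 182                 # roughly six months
bch_price_usd = 368        # price quoted above

diverted = block_reward_bch * blocks_per_day * fraction * days
print(f"~{diverted:,.0f} BCH, ~${diverted * bch_price_usd / 1e6:.1f}M")
# ~40,950 BCH, ~$15.1M
```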

Meanwhile, Bitcoin.com, which also signed the proposal, later clarified that the proposal is not a tax but "a service fee for the miners," as well as a temporary and reversible plan, one still in development, with many questions waiting for the community to discuss. The majority of the funds will not come through as an additional cost for existing BCH miners; much of it will be paid for by Bitcoin miners and through the minor decrease in hash rate on BCH, it says.

"It is a discussion at this point, so people should be free to discuss it however they would like," Roger Ver, CEO of Bitcoin.com, told Cryptonews.com, adding that "most of the BCH supporting miners mine BTC most of the time." As to how he came to be a signatory on this proposal, he said: "I agreed to start the discussion for this proposal. Discussing and doing are two different things."

Meanwhile, the anonymous group of miners had previously threatened that, should the proposal go through as is, the group would launch a competing BCH pool, voluntarily donating 1% of income to development teams. "We will continue to mine up to the hard fork, which will create our own chain after the fork due to the consensus rule change introduced by the signatories," the miners said before changing their minds.

Issue 35 Of CoinMetrics State Of The Network Re-Examines Largest Bitcoin Hacks In Its History – The Coin Republic

The 35th issue of CoinMetrics' State of the Network discusses the most massive hacks in the history of Bitcoin and the consequences they had for the world's most popular cryptocurrency.

Some of the most significant disturbances in the Bitcoin ecosystem were brought about by the cyber-crimes discussed in the report: the Bitcoinica, Mt. Gox, Bitfinex and Binance hacks. These have had the largest fallouts within the Bitcoin community.

The Bitcoinica incident is the first one discussed, and CoinMetrics sees it as the most influential breach of the Bitcoin market of all time. Bitcoinica was a trading platform that launched in 2011 and gained a large amount of community support in its early days.

But during 2012, the exchange suffered several cascading compromises of its systems that led to the theft of a large amount of Bitcoin, worth an estimated $650 million.

The first failure, which began a domino effect spanning just five months, was a failure of the web host Bitcoinica used: Linode. Bitcoinica's server was targeted by an attacker who used a compromised Linode web portal, and its wallet was emptied, draining 43,554 BTC ($213,886).

A few weeks later, the exchange's hot wallet was exploited, and another 18,547 BTC ($92,061) were stolen. Soon after, Bitcoinica's source code was leaked and its old Mt. Gox API key exposed.

Bitcoinica held some of its BTC on Mt. Gox, and the API key theft resulted in another 40,000 BTC ($305,236) stolen, along with another $40,000 in cash. Luckily, Bitcoin prices continued rallying, largely unaffected by the hacks. Roger Ver, an early investor in Bitcoin, had about 24,000 BTC in Bitcoinica and lost the most during the hacks.
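
A quick sanity check of the dollar figures in parentheses shows the implied BTC price at the time of each theft, only a few dollars per coin:

```python
# Implied BTC prices from the stolen-amount figures quoted above.
thefts = {
    "Linode wallet":   (43_554, 213_886),
    "Hot wallet":      (18_547,  92_061),
    "Mt. Gox API key": (40_000, 305_236),
}
for name, (btc, usd) in thefts.items():
    print(f"{name}: ${usd / btc:.2f} per BTC")
# Roughly $4.91, $4.96 and $7.63 -- 2012-era prices.
```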

Bitcoinica's source code leak also brought forth many new exchanges, still functioning today, that were built on Bitcoinica's foundations. The sudden death of a successful exchange also opened up space for plenty more competitors in the area.

The second major incident discussed in the report is the Mt. Gox hack. Mt. Gox was one of the most popular exchanges as Bitcoin began gaining global traction, and it handled a large share of Bitcoin trade volume from 2010 to 2013.

Unlike most other exchanges, Mt. Gox didn't go through a single devastating hack that ruined its industry presence. Throughout its life span, it suffered several hacks that eventually led to its demise in 2014.

The first hack saw 79,956 BTC ($70,000) taken out of Mt. Gox's wallet in 2011. Later that same year, its hot wallet was hacked and gradually drained without the exchange noticing.

By 2013, Mt. Gox had no more Bitcoin left to steal apart from what it held in cold storage. Strikingly, the public only became fully aware of the cascading hacks Mt. Gox had gone through in 2014, when the exchange stopped withdrawals. Bitcoin's price reacted violently to these events, crashing by hundreds of dollars as awareness of what had happened spread.

The Mt. Gox hack crippled the price of Bitcoin for a long period, and the cryptocurrency took more than three years to reach another high. But the incident and the number of failures Mt. Gox endured drew full media attention, and therefore introduced a large share of the population to Bitcoin.

The hack also severely affected the public outlook on Bitcoin, as many people lost faith in what the cryptocurrency could become, given such vulnerabilities in the entities that were supposed to manage it.

The third major hack discussed in the CoinMetrics report is the Bitfinex exploit. Bitfinex is one of the exchanges built from the ground up using the leaked Bitcoinica source code, and it remains a popular and active exchange to this day despite the hack.

In 2016, 119,756 BTC were stolen after an API key was compromised. The amount of BTC stolen was initially unclear, until on-chain analysis produced more definite figures. Bitcoin prices dropped by $200 right after the hack, but they recovered in the following months, even though Bitfinex lost about 36% of its cash reserves.

The exchange recovered from the incident by cutting 36% from all user account balances and offering users a BFX token for every dollar lost, or the option to convert their lost holdings into shares of its parent company, iFinex Inc.

The final hack featured in the issue was the 2019 Binance hack, in which 7,500 BTC were withdrawn from the exchange's hot wallet. The hackers exploited various retail accounts to break through the withdrawal limit and fool Binance's hot wallet processing system.

Although the theft posed a considerable loss for the exchange, Binance was well prepared to face such a situation with its Secure Asset Fund for Users, funded by 10% of all trading fees and stored separately in cold storage.

This allowed the exchange to avoid insolvency in the aftermath of the theft. The hack also turned out to have little to no impact on the trading price of Bitcoin, which continued to rally into the $7,000 range after news of the hack broke.

The Binance hack was complicated and well-orchestrated, and yet it posed minimal risk to the market, as the company managed to recover quickly.

This clearly shows that exchanges have come a long way since Bitcoinica, which dissolved entirely and remains in bankruptcy proceedings to this day. Exchanges have become more secure, and each hack has proven a milestone in the growth of digital assets, however painful for the victims.

What is quantum cognition? Physics theory could predict human behavior. – Livescience.com

The same fundamental platform that allows Schrödinger's cat to be both alive and dead, and also means two particles can "speak to each other" even across a galaxy's distance, could help to explain perhaps the most mysterious phenomenon of all: human behavior.

Quantum physics and human psychology may seem completely unrelated, but some scientists think the two fields overlap in interesting ways. Both disciplines attempt to predict how unruly systems might behave in the future. The difference is that one field aims to understand the fundamental nature of physical particles, while the other attempts to explain human nature along with its inherent fallacies.

"Cognitive scientists found that there are many 'irrational' human behaviors," Xiaochu Zhang, a biophysicist and neuroscientist at the University of Science and Technology of China in Hefei, told Live Science in an email. Classical theories of decision-making attempt to predict what choice a person will make given certain parameters, but fallible humans don't always behave as expected. Recent research suggests that these lapses in logic "can be well explained by quantum probability theory," Zhang said.

Zhang stands among the proponents of so-called quantum cognition. In a new study published Jan. 20 in the journal Nature Human Behaviour, he and his colleagues investigated how concepts borrowed from quantum mechanics can help psychologists better predict human decision-making. While recording what decisions people made on a well-known psychology task, the team also monitored the participants' brain activity. The scans highlighted specific brain regions that may be involved in quantum-like thought processes.

The study is "the first to support the idea of quantum cognition at the neural level," Zhang said.

Cool... now what does that really mean?

Quantum mechanics describes the behavior of the tiny particles that make up all matter in the universe, namely atoms and their subatomic components. One central tenet of the theory suggests a great deal of uncertainty in this world of the very small, something not seen at larger scales. For instance, in the big world, one can know where a train is on its route and how fast it's traveling, and given this data, one could predict when that train should arrive at the next station.

Now, swap out the train for an electron, and your predictive power disappears: you can't know the exact location and momentum of a given electron, but you could calculate the probability that the particle may appear in a certain spot, traveling at a particular rate. In this way, you could gain a hazy idea of what the electron might be up to.

Just as uncertainty pervades the subatomic world, it also seeps into our decision-making process, whether we're debating which new series to binge-watch or casting our vote in a presidential election. Here's where quantum mechanics comes in. Unlike classical theories of decision-making, the quantum world makes room for a certain degree of uncertainty.

Classical psychology theories rest on the idea that people make decisions in order to maximize "rewards" and minimize "punishments": in other words, to ensure their actions result in more positive outcomes than negative consequences. This logic, known as "reinforcement learning," falls in line with Pavlovian conditioning, wherein people learn to predict the consequences of their actions based on past experiences, according to a 2009 report in the Journal of Mathematical Psychology.

If truly constrained by this framework, humans would consistently weigh the objective values of two options before choosing between them. But in reality, people don't always work that way; their subjective feelings about a situation undermine their ability to make objective decisions.

Consider an example:

Imagine you're placing bets on whether a tossed coin will land on heads or tails. Heads gets you $200, tails costs you $100, and you can choose to toss the coin twice. When placed in this scenario, most people choose to take the bet twice regardless of whether the initial throw results in a win or a loss, according to a study published in 1992 in the journal Cognitive Psychology. Presumably, winners bet a second time because they stand to gain money no matter what, while losers bet in an attempt to recover their losses, and then some. However, if players aren't allowed to know the result of the first coin flip, they rarely make the second gamble.

When known, the first flip does not sway the choice that follows, but when unknown, it makes all the difference. This paradox does not fit within the framework of classical reinforcement learning, which predicts that the objective choice should always be the same. In contrast, quantum mechanics takes uncertainty into account and actually predicts this odd outcome.
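
The classical prediction is easy to make concrete: under the objective-value logic of reinforcement learning, the second bet has the same positive expected value whether the first flip is a known win, a known loss, or unknown, so the "rational" choice never changes. A minimal sketch:

```python
# Expected value of the second gamble: +$200 on heads, -$100 on tails.
p_heads = 0.5
ev_second_bet = p_heads * 200 + (1 - p_heads) * (-100)   # +$50

# The expected value is identical in all three situations...
for first_flip in ("known win", "known loss", "unknown"):
    print(f"First flip {first_flip}: EV of betting again = ${ev_second_bet:+.0f}")

# ...so a classical model predicts the same choice every time. Real
# participants, however, rarely take the second bet when the first
# outcome is unknown -- the paradox described above.
```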

"One could say that the 'quantum-based' model of decision-making refers essentially to the use of quantum probability in the area of cognition," Emmanuel Haven and Andrei Khrennikov, co-authors of the textbook "Quantum Social Science" (Cambridge University Press, 2013), told Live Science in an email.

Just as a particular electron might be here or there at a given moment, quantum mechanics assumes that the first coin toss resulted in both a win and a loss, simultaneously. (In other words, in the famous thought experiment, Schrödinger's cat is both alive and dead.) While caught in this ambiguous state, known as "superposition," an individual's final choice is unknown and unpredictable. Quantum mechanics also acknowledges that people's beliefs about the outcome of a given decision, whether it will be good or bad, often reflect what their final choice ends up being. In this way, people's beliefs interact, or become "entangled," with their eventual action.

Subatomic particles can likewise become entangled and influence each other's behavior even when separated by great distances. For instance, measuring the behavior of a particle located in Japan would alter the behavior of its entangled partner in the United States. In psychology, a similar analogy can be drawn between beliefs and behaviors. "It is precisely this interaction," or state of entanglement, "which influences the measurement outcome," Haven and Khrennikov said. The measurement outcome, in this case, refers to the final choice an individual makes. "This can be precisely formulated with the aid of quantum probability."

Scientists can mathematically model this entangled state of superposition, in which two particles affect each other even if they're separated by a large distance, as demonstrated in a 2007 report published by the Association for the Advancement of Artificial Intelligence. And remarkably, the final formula accurately predicts the paradoxical outcome of the coin toss paradigm. "The lapse in logic can be better explained by using the quantum-based approach," Haven and Khrennikov noted.
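
Schematically, the quantum-probability treatment amends the classical law of total probability with an interference term; this is the standard form used in the quantum cognition literature (with W and L denoting a won or lost first flip, B the choice to bet again, and theta an empirically fitted phase):

```latex
% Classical law of total probability:
P(B) = P(B \mid W)\,P(W) + P(B \mid L)\,P(L)

% Quantum probability adds an interference term:
P_q(B) = P(B \mid W)\,P(W) + P(B \mid L)\,P(L)
       + 2\sqrt{P(B \mid W)\,P(W)\,P(B \mid L)\,P(L)}\,\cos\theta
```

When the first outcome is known, the interference term vanishes and the classical prediction is recovered; when it is unknown, a negative cos(theta) suppresses the second bet, matching the observed behavior.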

In their new study, Zhang and his colleagues pitted two quantum-based models of decision-making against 12 classical psychology models to see which best predicted human behavior during a psychological task. The experiment, known as the Iowa Gambling Task, is designed to evaluate people's ability to learn from mistakes and adjust their decision-making strategy over time.

In the task, participants draw from four decks of cards. Each card either earns the player money or costs them money, and the object of the game is to earn as much money as possible. The catch lies in how each deck of cards is stacked. Drawing from one deck may earn a player large sums of money in the short term, but it will cost them far more cash by the end of the game. Other decks deliver smaller sums of money in the short-term, but fewer penalties overall. Through game play, winners learn to mostly draw from the "slow and steady" decks, while losers draw from the decks that earn them quick cash and steep penalties.
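
A minimal simulation of that payoff structure makes the "slow and steady" advantage visible. The numbers here are illustrative assumptions (the published task uses fixed reward and penalty schedules rather than the random penalties used below):

```python
import random

# Illustrative Iowa Gambling Task decks: "bad" decks pay more per card but
# carry penalties that make them net losers; "good" decks are the reverse.
def bad_deck():    # +$100 per card, expected net -$25
    return 100 - (1250 if random.random() < 0.1 else 0)

def good_deck():   # +$50 per card, expected net +$25
    return 50 - (250 if random.random() < 0.1 else 0)

random.seed(0)  # reproducible draws
for name, deck in (("bad", bad_deck), ("good", good_deck)):
    total = sum(deck() for _ in range(1000))
    print(f"{name} deck, 1,000 draws: net ${total:,}")
# Over many draws, the "good" decks come out ahead.
```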

Historically, those with drug addictions or brain damage perform worse on the Iowa Gambling Task than healthy participants, which suggests that their condition somehow impairs decision-making abilities, as highlighted in a study published in 2014 in the journal Applied Neuropsychology: Child. This pattern held true in Zhang's experiment, which included about 60 healthy participants and 40 who were addicted to nicotine.

The two quantum models made similar predictions to the most accurate among the classical models, the authors noted. "Although the [quantum] models did not overwhelmingly outperform the [classical] ... one should be aware that the [quantum reinforcement learning] framework is still in its infancy and undoubtedly deserves additional studies," they added.

To bolster the value of their study, the team took brain scans of each participant as they completed the Iowa Gambling Task. In doing so, the authors attempted to peek at what was happening inside the brain as participants learned and adjusted their game-play strategy over time. Outputs generated by the quantum model predicted how this learning process would unfold, and thus the authors theorized that hotspots of brain activity might somehow correlate with the models' predictions.

The scans did reveal a number of active brain areas in the healthy participants during game play, including activation of several large folds within the frontal lobe known to be involved in decision-making. In the smoking group, however, no hotspots of brain activity seemed tied to predictions made by the quantum model. As the model reflects participants' ability to learn from mistakes, the results may illustrate decision-making impairments in the smoking group, the authors noted.

However, "further research is warranted" to determine what these brain activity differences truly reflect in smokers and non-smokers, they added. "The coupling of the quantum-like models with neurophysiological processes in the brain ... is a very complex problem," Haven and Khrennikov said. "This study is of great importance as the first step towards its solution."

Models of classical reinforcement learning have shown "great success" in studies of emotion, psychiatric disorders, social behavior, free will and many other cognitive functions, Zhang said. "We hope that quantum reinforcement learning will also shed light on [these fields], providing unique insights."

In time, perhaps quantum mechanics will help explain pervasive flaws in human logic, as well as how that fallibility manifests at the level of individual neurons.

Originally published on Live Science.

A Tiny Glass Bead Goes as Still as Nature Allows – WIRED

Inside a small metal box on a laboratory table in Vienna, physicist Markus Aspelmeyer and his team have engineered, perhaps, the quietest place on earth.

The area in question is a microscopic spot in the middle of the box. Here, levitating in midair (except there is no air, because the box is in a vacuum) is a tiny glass bead a thousand times smaller than a grain of sand. Aspelmeyer's apparatus uses lasers to render this bead literally motionless. It is as still as it could possibly be, as permitted by the laws of physics: it has reached what physicists call the bead's motional ground state. "The ground state is the limit where you cannot extract any more energy from an object," says Aspelmeyer, who works at the University of Vienna. They can maintain the bead's motionlessness for hours at a time.

This stillness is different from anything you've ever perceived: overlooking that lake in the mountains, sitting in a sound-proofed studio, or even just staring at your laptop as it rests on the table. As calm as that table seems, if you could zoom in on it, "you would see its surface being attacked by air molecules that circulate via your ventilation system," says Aspelmeyer. Look hard enough and you'll see microscopic particles or tiny pieces of lint rolling around. In our day-to-day lives, stillness is an illusion. We're simply too large to notice the chaos.

Kahan Dare and Manuel Reisenbauer, physicists at the University of Vienna, adjust the apparatus where the levitated nanoparticle sits.

But this bead is truly still, regardless of whether you are judging it as a human or a dust mite. And at this level of stillness, our conventional wisdom about motion breaks down, as the bizarre rules of quantum mechanics kick in. For one thing, "the bead becomes delocalized," says Aspelmeyer. The bead spreads out. It no longer has a definite position, like a ripple in a pond, which stretches over an expanse of water rather than being at a particular location. Instead of maintaining a sharp boundary between bead and vacuum, the bead's outline becomes cloudy and diffuse.

Technically, although the bead is at the limit of its motionlessness, it still moves about a thousandth of its own diameter. Physicists have a cool name for it. "It's called the vacuum energy of the system," says Aspelmeyer. Put another way, nature does not allow any object to have completely zero motion. There must always be some quantum jiggle.
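
That residual jiggle has a standard size in textbook quantum mechanics. For a particle held in a harmonic trap of frequency omega (a generic formula, not one taken from Aspelmeyer's paper), the zero-point fluctuation amplitude is:

```latex
x_{\mathrm{zpf}} = \sqrt{\frac{\hbar}{2 m \omega}}
```

where m is the bead's mass; this is the "vacuum energy" motion the article describes.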

The bead's stillness comes with another caveat: Aspelmeyer's team has only forced the bead into its motional ground state along one dimension, not all three. But even achieving this level of stillness took them 10 years. One major challenge was simply getting the bead to stay levitated inside the laser beam, says physicist Uroš Delić of the University of Vienna. Delić has worked on the experiment since its nascence: first as an undergraduate student, then a PhD student, and now as a postdoc researcher.

New Centers Lead the Way towards a Quantum Future – Energy.gov

The world of quantum is the world of the very, very small. At sizes near those of atoms and smaller, the rules of physics start morphing into something unrecognizable, at least to us in the regular world. While quantum physics seems bizarre, it offers huge opportunities.

Quantum physics may hold the key to vast technological improvements in computing, sensing, and communication. Quantum computing may be able to solve problems in minutes that would take lifetimes on today's computers. Quantum sensors could act as extremely high-powered antennas for the military. Quantum communication systems could be nearly unhackable. But we don't have the knowledge or capacity to take advantage of these benefits yet.

The Department of Energy (DOE) recently announced that it will establish Quantum Information Science Centers to help lay the foundation for these technologies. As Congress put forth in the National Quantum Initiative Act, the DOE's Office of Science will make awards for at least two and up to five centers.

These centers will draw on both quantum physics and information theory to give us a soup-to-nuts understanding of quantum systems. Teams of researchers from universities, DOE national laboratories, and private companies will run them. Their expertise in quantum theory, technology development, and engineering will help each center undertake major, cross-cutting challenges. The centers' work will range from discovery research up to developing prototypes. They'll also address a number of different technical areas. Each center must tackle at least two of these subjects: quantum communication, quantum computing and emulation, quantum devices and sensors, materials and chemistry for quantum systems, and quantum foundries for synthesis, fabrication, and integration.

The impacts won't stop at the centers themselves. Each center will have a plan in place to transfer technologies to industry or other research partners. They'll also work to leverage DOE's existing facilities and collaborate with non-DOE projects.

As the nation's largest supporter of basic research in the physical sciences, the Office of Science is thrilled to head this initiative. Although quantum physics depends on the behavior of very small things, the Quantum Information Science Centers will be a very big deal.

The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, please visit https://www.energy.gov/science.

Scientists cooled a nanoparticle to the quantum limit – Science News

A tiny nanoparticle has been chilled to the max.

Physicists cooled a nanoparticle to the lowest temperature allowed by quantum mechanics. The particle's motion reached what's known as the ground state, or lowest possible energy level.

In a typical material, the amount that its atoms jostle around indicates its temperature. But in the case of the nanoparticle, scientists can define an effective temperature based on the motion of the entire nanoparticle, which is made up of about 100 million atoms. That temperature reached twelve-millionths of a kelvin, scientists report January 30 in Science.

Levitating it with a laser inside of a specially designed cavity, Markus Aspelmeyer of the University of Vienna and colleagues reduced the nanoparticle's motion to the ground state, a minimum level set by the Heisenberg uncertainty principle, which states that there's a limit to how well you can simultaneously know the position and momentum of an object.
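
The connection between the uncertainty principle and a nonzero minimum energy is standard textbook physics (not specific to this paper): for a harmonic trap of frequency omega, the uncertainty relation forces a ground-state energy of half a quantum,

```latex
\Delta x \, \Delta p \ge \frac{\hbar}{2}
\quad \Longrightarrow \quad
E_{\mathrm{min}} = \tfrac{1}{2}\hbar\omega > 0
```

which is why the particle's motion can be reduced to, but never below, the ground state.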

While quantum mechanics is unmistakable in tiny atoms and electrons, its effects are harder to observe on larger scales. To better understand the theory, physicists have previously isolated its effects in other solid objects, such as vibrating membranes or beams (SN: 4/25/18). But nanoparticles have the advantage that they can be levitated and precisely controlled with lasers.

Eventually, Aspelmeyer and colleagues aim to use cooled nanoparticles to study how gravity behaves for quantum objects, a poorly understood realm of physics. "This is the really long-term dream," he says.

What Is Quantum Computing and How Does it Work? – Built In

Accustomed to imagining worst-case scenarios, many cryptography experts are more concerned than usual these days: one of the most widely used schemes for safely transmitting data is poised to become obsolete once quantum computing reaches a sufficiently advanced state.

The cryptosystem known as RSA provides the safety structure for a host of privacy and communication protocols, from email to internet retail transactions. Current standards rely on the fact that no one has the computing power to test every possible way to de-scramble your data once encrypted, but a mature quantum computer could try every option within a matter of hours.

It should be stressed that quantum computers haven't yet hit that level of maturity, and won't for some time, but when a large, stable device is built (or if it's built, as an increasingly diminishing minority argue), its unprecedented ability to factor large numbers would essentially leave the RSA cryptosystem in tatters. Thankfully, the technology is still a ways away, and the experts are on it.
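
To see concretely why factoring power matters, here is a toy RSA round trip with deliberately tiny primes. This is an illustration of the principle only, not a secure implementation; real deployments use moduli hundreds of digits long, far beyond the brute-force factoring below:

```python
# Toy RSA: encrypt, decrypt, then "attack" by factoring the modulus.
p, q = 61, 53            # secret primes (absurdly small, for illustration)
n = p * q                # public modulus
phi = (p - 1) * (q - 1)
e = 17                   # public exponent
d = pow(e, -1, phi)      # private exponent (modular inverse; Python 3.8+)

msg = 42
cipher = pow(msg, e, n)            # encrypt with the public key
assert pow(cipher, d, n) == msg    # decrypt with the private key

# An attacker who can factor n recovers the private key outright:
def factor(n):
    for i in range(2, int(n ** 0.5) + 1):
        if n % i == 0:
            return i, n // i

p2, q2 = factor(n)
d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
assert pow(cipher, d2, n) == msg   # the attacker reads the message
```

A mature quantum computer running Shor's algorithm would play the role of `factor()` here, but for numbers far too large for any classical search.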

"Don't panic." That's what Mike Brown, CTO and co-founder of quantum-focused cryptography company ISARA Corporation, advises anxious prospective clients. The threat is far from imminent. "What we hear from the academic community and from companies like IBM and Microsoft is that a 2026-to-2030 timeframe is what we typically use from a planning perspective in terms of getting systems ready," he said.

Cryptographers from ISARA are among several contingents currently taking part in the Post-Quantum Cryptography Standardization project, a contest of quantum-resistant encryption schemes. The aim is to standardize algorithms that can resist attacks levied by large-scale quantum computers. The competition was launched in 2016 by the National Institute of Standards and Technology (NIST), a federal agency that helps establish tech and science guidelines, and is now gearing up for its third round.

Indeed, the level of complexity and stability required of a quantum computer to launch the much-discussed RSA attack is very extreme, according to John Donohue, scientific outreach manager at the University of Waterloo's Institute for Quantum Computing. Even granting that timelines in quantum computing, particularly in terms of scalability, are points of contention, "the community is pretty comfortable saying that's not something that's going to happen in the next five to 10 years," he said.

When Google announced that it had achieved quantum supremacy, meaning it used a quantum computer to run, in minutes, an operation that would take thousands of years to complete on a classical supercomputer, that machine operated on 54 qubits, the computational bedrocks of quantum computing. While IBM's Q 53 system operates at a similar level, many current prototypes operate on as few as 20 or even five qubits.

But how many qubits would be needed to crack RSA? "Probably on the scale of millions of error-tolerant qubits," Donohue told Built In.

Scott Aaronson, a computer scientist at the University of Texas at Austin, underscored the same last year on his popular blog after presidential candidate Andrew Yang tweeted that "no code is uncrackable" in the wake of Google's proof-of-concept milestone.

That's the good news. The bad news is that, while cryptography experts gain more time to keep our data secure from quantum computers, the technology's numerous potential upsides, ranging from drug discovery to materials science to financial modeling, are also largely forestalled. And that question of error tolerance continues to stand as quantum computing's central, Herculean challenge. But before we wrestle with that, let's get a better elemental sense of the technology.

Quantum computers process information in a fundamentally different way than classical computers. Traditional computers operate on binary bits: information processed in the form of ones or zeroes. But quantum computers transmit information via quantum bits, or qubits, which can exist as one or zero or both simultaneously. That's a simplification, and we'll explore some nuances below, but that capacity, known as superposition, lies at the heart of quantum's potential for exponentially greater computational power.

Such fundamental complexity both cries out for and resists succinct laymanization. When the New York Times asked 10 experts to explain quantum computing in the length of a tweet, some responses raised more questions than they answered:

Microsoft researcher David Reilly:

A quantum machine is a kind of analog calculator that computes by encoding information in the ephemeral waves that comprise light and matter at the nanoscale.

D-Wave Systems executive vice president Alan Baratz:

If we're honest, everything we currently know about quantum mechanics can't fully describe how a quantum computer works.

Quantum computing also cries out for a digestible metaphor. Quantum physicist Shohini Ghose, of Wilfrid Laurier University, has likened the difference between quantum and classical computing to light bulbs and candles: "The light bulb isn't just a better candle; it's something completely different."

Rebecca Krauthamer, CEO of quantum computing consultancy Quantum Thought, compares quantum computing to a crossroads that allows a traveler to take both paths. "If you're trying to solve a maze, you'd come to your first gate, and you can go either right or left," she said. "We have to choose one, but a quantum computer doesn't have to choose one. It can go right and left at the same time."

It can, in a sense, "look at these different options simultaneously and then instantly find the most optimal path," she said. "That's really powerful."

The most commonly used example of quantum superposition is Schrödinger's cat.

Despite its ubiquity, many in the QC field aren't so taken with Schrödinger's cat. "The more interesting fact about superposition, rather than the two-things-at-once point of focus, is the ability to look at quantum states in multiple ways, and ask it different questions," said Donohue. That is, rather than having to perform tasks sequentially, like a traditional computer, quantum computers can run vast numbers of parallel computations.

Part of Donohue's professional charge is clarifying quantum's nuances, so it's worth quoting him here at length:

"In superposition I can have state A and state B. I can ask my quantum state, are you A or B? And it will tell me, I'm A or I'm B. But I might have a superposition of A + B, in which case, when I ask it, are you A or B? it'll tell me A or B randomly.

"But the key of superposition is that I can also ask the question, are you in the superposition state of A + B? And then in that case, they'll tell me, yes, I am the superposition state A + B.

"But there's always going to be an opposite superposition. So if it's A + B, the opposite superposition is A - B."

That's about as simplified as we can get before trotting out equations. But the top-line takeaway is that superposition is what lets a quantum computer try all paths at once.
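
Donohue's A/B questions map neatly onto vector arithmetic. A minimal numerical sketch using the standard formalism (A and B as basis states; measurement probability as the squared overlap):

```python
import numpy as np

A = np.array([1.0, 0.0])   # basis state A
B = np.array([0.0, 1.0])   # basis state B

plus  = (A + B) / np.sqrt(2)   # superposition A + B
minus = (A - B) / np.sqrt(2)   # the opposite superposition, A - B

def prob(state, question):
    """Probability that 'state' answers yes when asked 'are you question?'"""
    return abs(np.dot(question, state)) ** 2

print(prob(plus, A))      # 0.5 -- asked "are you A?", it answers yes half the time
print(prob(plus, plus))   # 1.0 -- asked "are you A + B?", it always says yes
print(prob(plus, minus))  # 0.0 -- it is never the opposite superposition
```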

That's not to say that such unprecedented computational heft will displace or render moot classical computers. "One thing that we can really agree on in the community is that it won't solve every type of problem that we run into," said Krauthamer.

But quantum computing is particularly well suited for certain kinds of challenges. Those include probability problems, optimization (what is, say, the best possible travel route?) and the incredible challenge of molecular simulation for use cases like drug development and materials discovery.

The cocktail of hype and complexity has a way of fuzzing outsiders' conception of quantum computing, which makes this point worth underlining: quantum computers exist, and they are being used right now.

They are not, however, presently solving climate change, turbocharging financial forecasting probabilities or performing other similarly lofty tasks that get bandied about in reference to quantum computing's potential. QC may have commercial applications related to those challenges, which we'll explore further below, but that's well down the road.

Today, we're still in what's known as the NISQ era: Noisy, Intermediate-Scale Quantum. In a nutshell, quantum noise makes such computers incredibly difficult to stabilize. As such, NISQ computers can't be trusted to make decisions of major commercial consequence, which means they're currently used primarily for research and education.

"The technology just isn't quite there yet to provide a computational advantage over what could be done with other methods of computation at the moment," said Donohue. "Most [commercial] interest is from a long-term perspective. [Companies] are getting used to the technology so that when it does catch up, and that timeline is a subject of fierce debate, they're ready for it."

Also, it's fun to sit next to the cool kids. "Let's be frank. It's good PR for them, too," said Donohue.

But NISQ computers' R&D practicality is demonstrable, if decidedly small-scale. Donohue cites the molecular modeling of lithium hydride. That's a small enough molecule that it can also be simulated using a supercomputer, but the quantum simulation provides an important opportunity to check our answers after a classical-computer simulation. NISQs have also delivered some results for problems in high-energy particle physics, Donohue noted.

One breakthrough came in 2017, when researchers at IBM modeled beryllium hydride, the largest molecule simulated on a quantum computer to date. Another key step arrived in 2019, when IonQ researchers used quantum computing to go bigger still, by simulating a water molecule.

"These are generally still small problems that can be checked using classical simulation methods. But it's building toward things that will be difficult to check without actually building a large particle physics experiment, which can get very expensive," Donohue said.

And curious minds can get their hands dirty right now. Users can operate small-scale quantum processors via the cloud through IBM's online Q Experience and its open-source software Qiskit. Late last year, Microsoft and Amazon both announced similar platforms, dubbed Azure Quantum and Braket. "That's one of the cool things about quantum computing today," said Krauthamer. "We can all get on and play with it."
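
As a taste of what playing with it looks like, here is a minimal circuit sketched against Qiskit's early-2020 API (these calls existed at the time of writing, though the library's interface has shifted between versions): it entangles two qubits and samples the result on a local simulator.

from qiskit import QuantumCircuit, Aer, execute

qc = QuantumCircuit(2, 2)
qc.h(0)                      # put qubit 0 into superposition
qc.cx(0, 1)                  # entangle qubit 1 with qubit 0
qc.measure([0, 1], [0, 1])   # read both qubits into classical bits

backend = Aer.get_backend('qasm_simulator')
counts = execute(qc, backend, shots=1000).result().get_counts()
print(counts)                # roughly {'00': 500, '11': 500} -- never '01' or '10'

Swapping the simulator backend for one of IBM's cloud-hosted devices runs the same circuit on real hardware.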

Quantum computing may still be in its fussy, uncooperative stage, but that hasn't stopped commercial interests from diving in.

IBM announced at the recent Consumer Electronics Show that its so-called Q Network had expanded to more than 100 companies and organizations. Partners now range from Delta Air Lines to health insurer Anthem to Daimler AG, which owns Mercedes-Benz.

Some of those partnerships hinge on quantum computing's aforementioned promise in terms of molecular simulation. Daimler, for instance, is hoping the technology will one day yield a way to produce better batteries for electric vehicles.

Elsewhere, partnerships between quantum computing startups and leading companies in the pharmaceutical industry, like those established between 1QBit and Biogen, and between ProteinQure and AstraZeneca, point to quantum molecular modeling's drug-discovery promise, distant though it remains. (Today, drug development is done through expensive, relatively low-yield trial and error.)

Researchers would need millions of qubits to compute the chemical properties of a novel substance, noted theoretical physicist Sabine Hossenfelder in the Guardian last year. But the conceptual underpinning, at least, is there. "A quantum computer knows quantum mechanics already, so I can essentially program in how another quantum system would work and use that to echo the other one," explained Donohue.

There's also hope that large-scale quantum computers will help accelerate AI, and vice versa, although experts disagree on this point. "The reason there's controversy is, things have to be redesigned in a quantum world," said Krauthamer, who considers herself an AI-quantum optimist. "We can't just translate algorithms from regular computers to quantum computers because the rules are completely different, at the most elemental level."

Some believe quantum computers can help combat climate change by improving carbon capture. Jeremy O'Brien, CEO of Palo Alto-based PsiQuantum, wrote last year that quantum simulation of larger molecules, if achieved, could help build a catalyst for scrubbing carbon dioxide directly from the atmosphere.

Long-term applications tend to dominate headlines, but they also lead us back to quantum computing's defining hurdle, and the reason coverage remains littered with terms like "potential" and "promise": error correction.

Qubits, it turns out, are higher maintenance than even the most meltdown-prone rock star. Any number of simple actions or variables can send error-prone qubits falling into decoherence, or the loss of a quantum state (mainly that all-important superposition). Things that can cause a quantum computer to crash include measuring qubits and running operations; in other words, using it. Even small vibrations and temperature shifts will cause qubits to decohere, too.

That's why quantum computers are kept isolated, and the ones that run on superconducting circuits, the most prominent method, favored by Google and IBM, have to be kept at near absolute zero (a cool -460 degrees Fahrenheit).

The challenge is twofold, according to Jonathan Carter, a scientist at Berkeley Quantum. First, individual physical qubits need to have better fidelity. That would conceivably happen through better engineering, optimal circuit layout, and the right combination of components. Second, we have to arrange them to form logical qubits.

Estimates range from hundreds to thousands to tens of thousands of physical qubits required to form one fault-tolerant qubit. "I think it's safe to say that none of the technology we have at the moment could scale out to those levels," Carter said.
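
Some quick arithmetic shows what those estimates imply. The sketch below is purely illustrative: the ratios come from the range quoted above, and the 100-logical-qubit machine is a hypothetical target, not anyone's roadmap.

# Physical-qubit cost of a hypothetical 100-logical-qubit machine
logical_qubits = 100

for physical_per_logical in (100, 1_000, 10_000):
    total = logical_qubits * physical_per_logical
    print(f"{physical_per_logical:>6} physical per logical -> {total:,} physical qubits")

# Output ranges from 10,000 to 1,000,000 physical qubits --
# versus the roughly 50-qubit NISQ devices of today.

At the high end, even a modest fault-tolerant machine demands a million physical qubits, which is why none of the current hardware scales to those levels.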

From there, researchers would also have to build ever-more complex systems to handle the increase in qubit fidelity and numbers. So how long will it take until hardware-makers actually achieve the necessary error correction to make quantum computers commercially viable?

"Some of these other barriers make it hard to say yes to a five- or 10-year timeline," Carter said.

Donohue invokes and rejects the same figure. "Even the optimist wouldn't say it's going to happen in the next five to 10 years," he said. At the same time, some small optimization problems, specifically in terms of random number generation, could happen very soon.

"We've already seen some useful things in that regard," he said.

For people like Michael Biercuk, founder of quantum-engineering software company Q-CTRL, the only technical commercial milestone that matters now is quantum advantage, or, as he uses the term, when a quantum computer provides some time or cost advantage over a classical computer. Count him among the optimists: he foresees a five-to-eight-year time scale to achieve such a goal.

Another open question: which method of quantum computing will become standard? While superconducting has borne the most fruit so far, researchers are exploring alternative methods that involve trapped ions, quantum annealing or so-called topological qubits. In Donohue's view, it's not necessarily a question of which technology is better so much as one of finding the best approach for different applications. For instance, superconducting chips naturally dovetail with the magnetic-field technology that underpins neuroimaging.

The challenges that quantum computing faces, however, aren't strictly hardware-related. The magic of quantum computing resides in algorithmic advances, not speed, Greg Kuperberg, a mathematician at the University of California at Davis, is quick to underscore.

"If you come up with a new algorithm, for a question that it fits, things can be exponentially faster," he said, using "exponentially" literally, not metaphorically. (There are currently 63 algorithms listed and 420 papers cited at Quantum Algorithm Zoo, an online catalog of quantum algorithms compiled by Microsoft quantum researcher Stephen Jordan.)
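
To see why "exponentially" matters, compare step counts as problem size grows. This sketch is illustrative only: it assumes a hypothetical problem costing 2**n steps classically and n**3 steps with a suitable quantum algorithm, rather than modeling any specific entry in the Zoo.

# Exponential vs. polynomial scaling, purely for intuition
for n in (10, 20, 40, 60):
    classical = 2 ** n   # brute-force cost grows exponentially
    quantum = n ** 3     # hypothetical polynomial-cost quantum algorithm
    print(f"n={n:>2}: classical ~{classical:.2e} steps, quantum {quantum:,} steps")

# By n=60 the classical count exceeds 10^18 steps, while the
# hypothetical quantum count is still under a quarter million.

The gap is not a constant factor that better chips could close; it widens without bound, which is why a fitting algorithm matters more than raw speed.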

Another roadblock, according to Krauthamer, is a general lack of expertise. "There's just not enough people working at the software level or at the algorithmic level in the field," she said. Tech entrepreneur Jack Hidary's team set out to count the number of people working in quantum computing and found only about 800 to 850 people, according to Krauthamer. "That's a bigger problem to focus on, even more than the hardware," she said. "Because the people will bring that innovation."

While the community underscores the importance of outreach, the term quantum supremacy has itself come under fire. "In our view, 'supremacy' has overtones of violence, neocolonialism and racism through its association with 'white supremacy,'" 13 researchers wrote in Nature late last year. The letter has kickstarted an ongoing conversation among researchers and academics.

But the field's attempt to attract and expand also comes at a time of uncertainty in terms of broader information-sharing.

Quantum computing research is sometimes framed in the same adversarial terms as conversations about trade and other emerging tech; that is, U.S. versus China. An oft-cited statistic from patent analytics consultancy Patinformatics states that, in 2018, China filed 492 patents related to quantum technology, compared to just 248 in the United States. That same year, the think tank Center for a New American Security published a paper that warned, "China is positioning itself as a powerhouse in quantum science." By the end of 2018, the U.S. had passed and signed into law the National Quantum Initiative Act. Many in the field believe legislators were compelled by China's perceived growing advantage.

The initiative has spurred domestic research (the Department of Energy recently announced up to $625 million in funding to establish up to five quantum information research centers), but the geopolitical tensions give some in the quantum computing community pause, namely for fear of collaboration-chilling regulation. "As quantum technology has become prominent in the media, among other places, there has been a desire suddenly among governments to clamp down," said Biercuk, who has warned of poorly crafted and nationalistic export controls in the past.

"What they don't understand often is that quantum technology and quantum information in particular really are deep research activities where open transfer of scientific knowledge is essential," he added.

The National Science Foundation, one of the government agencies given additional funding and directives under the act, generally has a positive track record in terms of avoiding draconian security controls, Kuperberg said. Even still, the antagonistic framing tends to obscure the on-the-ground facts. "The truth behind the scenes is that, yes, China would like to be doing good research in quantum computing, but a lot of what they're doing is just scrambling for any kind of output," he said.

Indeed, the majority of the aforementioned Chinese patents are for quantum tech broadly, not quantum computing tech, which is where the real promise lies.

The Department of Energy has an internal list of sensitive technologies that it could potentially restrict DOE researchers from sharing with counterparts in China, Russia, Iran and North Korea. It has not yet implemented that curtailment, however, as DOE Office of Science director Chris Fall told the House Committee on Science, Space and Technology, and clarified to Science, in January.

Along with such multi-agency-focused government spending, there's been a tsunami of venture capital directed toward commercial quantum-computing interests in recent years. A Nature analysis found that, in 2017 and 2018, private funding in the industry hit at least $450 million.

Still, funding concerns linger in some corners. Even as Google's quantum supremacy proof of concept has helped heighten excitement among enterprise investors, Biercuk has also flagged the beginnings of a contraction in investment in the sector.

Even as exceptional cases dominate headlines (he points to PsiQuantum's recent $230 million venture windfall), there are lesser-reported signs of struggle. "I know of probably four or five smaller shops that started and closed within about 24 months; others were absorbed by larger organizations because they struggled to raise," he said.

At the same time, signs of at least moderate investor agitation and internal turmoil have emerged. The Wall Street Journal reported in January that much-buzzed quantum computing startup Rigetti Computing saw its CTO and COO, among other staff, depart amid concerns that the company's tech wouldn't be commercially viable in a reasonable time frame.

Investor expectations had become inflated in some instances, according to experts. "Some very good teams have faced more investor skepticism than I think has been justified ... This is not six months to mobile application development," Biercuk said.

In Kuperberg's view, part of the problem is that venture capital and quantum computing operate on completely different timelines. "Putting venture capital into this in the hope that some profitable thing would arise quickly, that doesn't seem very natural to me in the first place," he said, adding the caveat that he considers the majority of QC money prestige investment rather than strictly ROI-focused.

But some startups themselves may have had some hand in driving financiers' over-optimism. "I won't name names, but there definitely were some people giving investors outsize expectations, especially when people started coming up with some pieces of hardware, saying that advantages were right around the corner," said Donohue. "That very much rubbed the academic community the wrong way."

Scott Aaronson recently called out two prominent startups for what he described as a sort of calculated equivocation. He wrote of a pattern in which a party will speak of a quantum algorithm's promise, without asking whether there are any indications that your approach will ever be able to exploit interference of amplitudes to outperform the best classical algorithm.

And, mea culpa, some blame for the hype surely lies with tech media. "Trying to crack an area for a lay audience means you inevitably sacrifice some scientific precision," said Biercuk. (Thanks for understanding.)

It's all led to a willingness to serve up a glass of cold water now and again. As Juani Bermejo-Vega, a physicist and researcher at the University of Granada in Spain, recently told Wired, the machine on which Google ran its milestone proof of concept is mostly still a useless quantum computer for practical purposes.

Bermejo-Vega's quote came in a story about the emergence of a Twitter account called Quantum Bullshit Detector, which decrees, @artdecider-like, a "bullshit" or "not bullshit" quote tweet of various quantum claims. The fact that leading quantum researchers are among the account's 9,000-plus base of followers would seem to indicate that some weariness exists among the ranks.

But even with the various challenges, cautious optimism seems to characterize much of the industry. "For good and ill, I'm vocal about maintaining scientific and technical integrity while also being a true optimist about the field and sharing the excitement that I have and to excite others about what's coming," Biercuk said.

This year could prove to be formative in the quest to use quantum computers to solve real-world problems, said Krauthamer. "Whenever I talk to people about quantum computing, without fail, they come away really excited. Even the biggest skeptics who say, 'Oh no, they're not real. It's not going to happen for a long time.'"

Read the original post:

What Is Quantum Computing and How Does it Work? - Built In

Have We Solved the Black Hole Information Paradox? – Scientific American

Black holes, some of the most peculiar objects in the universe, pose a paradox for physicists. Two of our best theories give us two different, and seemingly contradictory, pictures of how these objects work. Many scientists, including myself, have been trying to reconcile these visions, not just to understand black holes themselves, but also to answer deeper questions, such as "What is spacetime?" While I and other researchers made some partial progress over the years, the problem persisted. In the past year or so, however, I have developed a framework that I believe elegantly addresses the problem and gives us a glimpse of the mystery of how spacetime emerges at the most fundamental level.

Here is the problem: From the perspective of general relativity, black holes arise if the density of matter becomes too large and gravity collapses the material all the way toward its central point. When this happens, gravity is so strong in this region that nothing, not even light, can escape. The inside of the black hole, therefore, cannot be seen from the outside, even in principle, and the boundary, called the event horizon, acts as a one-way membrane: nothing can go from the interior to the exterior, but there is no problem in falling through it from the exterior to the interior.

But when we consider the effect of quantum mechanics, the theory governing elementary particles, we get another picture. In 1974, Stephen Hawking presented a calculation that made him famous. He discovered that, if we include quantum mechanical effects, a black hole in fact radiates, although very slowly. As a result, it gradually loses its mass and eventually evaporates. This conclusion has been checked by multiple methods now, and its basic validity is beyond doubt. The odd thing, however, is that in Hawking's calculation, the radiation emitted from a black hole does not depend on how the object was created. This means that two black holes created from different initial states can end up with the identical final radiation.
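
For reference, the standard expression for this effect (a textbook result, not spelled out in this essay) gives the temperature of the emitted radiation for a black hole of mass $M$:

$$T_H = \frac{\hbar c^3}{8 \pi G M k_B}$$

Heavier black holes are colder; a solar-mass black hole sits at roughly $6 \times 10^{-8}$ kelvin, which is why the evaporation Hawking predicted is so extraordinarily slow.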

Is this a problem? Yes, it is. Modern physics is built on the assumption that if we have perfect knowledge about a system, then we can predict its future and infer its past by solving the equation of motion. Hawking's result would mean that this basic tenet is incorrect. Many of us thought that this problem was solved in 1997 when Juan Maldacena discovered a new way to view the situation, which seemed to prove no information was lost.
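
In quantum-mechanical language, the tenet at stake is unitarity. A closed system evolves as

$$|\psi(t)\rangle = U(t)\,|\psi(0)\rangle, \qquad U^\dagger U = \mathbb{1},$$

and because $U$ is invertible, the initial state can always be reconstructed from the final one. Two distinct initial states therefore can never evolve into identical final radiation, which is precisely what Hawking's calculation seemed to allow.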

Case closed? Not quite. In 2012, Ahmed Almheiri and collaborators at the University of California, Santa Barbara, presented in their influential paper a strong argument that if the information is preserved in the Hawking emission process, then it is inconsistent with the smoothness of the horizon, the notion that an object can pass through the event horizon without being affected. Given that the option of information loss is out of the question, they argued that the black hole horizon is in fact not a one-way membrane but something like an unbreakable wall, which they called a firewall.

This confused theorists tremendously. As much as they disliked information loss, they abhorred firewalls too. Among other things, the firewall idea implies that Einstein's general relativity is completely wrong, at least at the horizon of a black hole. In fact, this is utterly counterintuitive. For a large black hole, gravity at the horizon is actually very weak because it lies far away from the central point, where all the matter is located. A region near the horizon thus looks pretty much like empty space, and yet the firewall argument says that space must abruptly end at the location of the horizon.

The main thrust of my new work is to realize that there are multiple layers of descriptions of a black hole, and the preservation of information and the smoothness of the horizon refer to theories at different layers. At one level, we can describe a black hole as viewed from a distance: the black hole is formed by collapse of matter, which eventually evaporates, leaving the quanta of Hawking radiation in space. From this perspective, Maldacena's insight holds and there is no information loss in the process. That is because, in this picture, an object falling toward the black hole never enters the horizon, not because of a firewall but because of time delay between the clock of the falling object and that of a distant observer. The object seems to be slowly absorbed into the horizon, and its information is later sent back to space in the form of subtle correlations between particles of Hawking radiation.

On the other hand, the picture of the black hole interior emerges when looking at the system from the perspective of someone falling into it. Here we must ignore the fine details of the system that an infalling observer could not see, since such an observer has only finite time before hitting the singular point at the center of the black hole. This limits the amount of information they can access, even in principle. The world the infalling observer perceives, therefore, is the coarse-grained one. And in this picture, information need not be preserved because we already threw away some information even to arrive at this perspective. This is the way the existence of interior spacetime can be compatible with the preservation of information: they are the properties of the descriptions of nature at different levels!

To understand this concept better, the following analogy might help. Imagine water in a tank and consider a theory describing waves on the surface. At a fundamental level, water consists of a bunch of water molecules, which move, vibrate and collide with each other. With perfect knowledge of their properties, we can describe them deterministically without information loss. This description would be complete, and there would be no need to even introduce the concept of waves. On the other hand, we could focus on the waves by overlooking molecular level details and describing the water as a liquid. The atomic-level information, however, is not preserved in this description. For example, a wave can simply disappear, although the truth is that the coherent motion of water molecules that created the wave was transformed into a more random motion of each molecule without anything disappearing.

This framework tells us that the picture of spacetime offered by general relativity is not as fundamental as we might have thought; it is merely a picture that emerges at a higher level in the hierarchical descriptions of nature, at least concerning the interior of a black hole. Similar ideas have been discussed earlier in varying forms, but the new framework allows us to explicitly identify the relevant microscopic degrees of freedom (in other words, nature's fundamental building blocks) participating in the emergence of spacetime, which surprisingly involves elements that we normally think to be located far away from the region of interest.

This new way of thinking about the paradox can also be applied to a recent setup devised by Geoff Penington, Stephen H. Shenker, Douglas Stanford and Zhenbin Yang in which Maldacena's scenario is applied more rigorously but in simplified systems. This allows us to identify which features of a realistic black hole are or are not captured by such analyses.

Beginning with the era of Descartes and Galilei, revolutions in physics have often been associated with new understandings of the concept of spacetime, and it seems that we are now in the middle of another such revolution. I strongly suspect that we may soon witness the emergence of a new understanding of nature at a qualitatively different and deeper level.

Read more:

Have We Solved the Black Hole Information Paradox? - Scientific American

Why physicists are determined to prove Galileo and Einstein wrong – Livescience.com

In the 17th century, famed astronomer and physicist Galileo Galilei is said to have climbed to the top of the Tower of Pisa and dropped two different-sized cannonballs. He was trying to demonstrate his theory, which Albert Einstein later updated and added to his theory of relativity, that objects fall at the same rate regardless of their size.

Now, after spending two years dropping two objects of different mass into a free fall in a satellite, a group of scientists has concluded that Galileo and Einstein were right: The objects fell at a rate that was within two-trillionths of a percent of each other, according to a new study.

This effect has been confirmed time and time again, as has Einstein's theory of relativity, yet scientists still aren't convinced that there isn't some kind of exception somewhere. "Scientists have always had a difficult time actually accepting that nature should behave that way," said senior author Peter Wolf, research director at the French National Center for Scientific Research's Paris Observatory.

That's because there are still inconsistencies in scientists' understanding of the universe.

"Quantum mechanics and general relativity, which are the two basic theories all of physics is built on today ...are still not unified," Wolf told Live Science. What's more, although scientific theory says the universe is made up mostly of dark matter and dark energy, experiments have failed to detect these mysterious substances.

"So, if we live in a world where there's dark matter around that we can't see, that might have an influence on the motion of [objects]," Wolf said. That influence would be "a very tiny one," but it would be there nonetheless. So, if scientists see test objects fall at different rates, that "might be an indication that we're actually looking at the effect of dark matter," he added.

Wolf and an international group of researchers, including scientists from France's National Center for Space Studies and the European Space Agency, set out to test Einstein and Galileo's foundational idea that no matter where you do an experiment, no matter how you orient it and what velocity you're moving at through space, the objects will fall at the same rate.

The researchers put two cylindrical objects, one made of titanium and the other platinum, inside each other and loaded them onto a satellite. The orbiting satellite was naturally "falling" because there were no forces acting on it, Wolf said. They suspended the cylinders within an electromagnetic field and dropped the objects for 100 to 200 hours at a time.

From the forces the researchers needed to apply to keep the cylinders in place inside the satellite, the team deduced how the cylinders fell and the rate at which they fell, Wolf said.

And, sure enough, the team found that the two objects fell at almost exactly the same rate, within two-trillionths of a percent of each other. That suggested Galileo was correct. What's more, they dropped the objects at different times during the two-year experiment and got the same result, suggesting Einstein's theory of relativity was also correct.
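
The conventional figure of merit for such tests is the Eötvös ratio (not named in the article, but standard in this literature), which compares the accelerations $a_1$ and $a_2$ of the two falling masses:

$$\eta = 2\,\frac{|a_1 - a_2|}{a_1 + a_2}$$

The "two-trillionths of a percent" above corresponds to $\eta \approx 2 \times 10^{-14}$: any violation of the equivalence principle, if one exists at all, is smaller than that.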

Their test was an order of magnitude more sensitive than previous tests. Even so, the researchers have published only 10% of the data from the experiment, and they hope to do further analysis of the rest.

Not satisfied with this mind-boggling level of precision, scientists have put together several new proposals to do similar experiments with two orders of magnitude greater sensitivity, Wolf said. Also, some physicists want to conduct similar experiments at the tiniest scale, with individual atoms of different types, such as rubidium and potassium, he added.

The findings were published Dec. 2 in the journal Physical Review Letters.

Originally published on Live Science.

Read more:

Why physicists are determined to prove Galileo and Einstein wrong - Livescience.com

Fibre on steroids: Wits student uses quantum physics for massive improvements – MyBroadband

A research team at Wits University has discovered a way to improve data transmission across fibre networks.

The team comprises a PhD student at the university, as well as several colleagues from Wits and Huazhong University of Science and Technology in Wuhan, China.

This research uses quantum physics to improve data security across fibre networks without the need to replace legacy fibre infrastructure.

"Our team showed that multiple patterns of light are accessible through conventional optical fibre that can only support a single pattern," said Wits PhD student Isaac Nape.

"We achieved this quantum trick by engineering the entanglement of two photons. We sent the polarised photon down the fibre line and accessed many other patterns with the other photon."

Entanglement refers to particles interacting in a way that the quantum state of each particle cannot be described without reference to the state of the others, even if the particles are separated by large distances.

In this scenario, the researchers manipulated the qualities of the photon on the inside of the fibre line by changing the qualities of its entangled counterpart in free space.

"In essence, the research introduces the concept of communicating across legacy fibre networks with multi-dimensional entangled states, bringing together the benefits of existing quantum communication with polarised photons with that of high-dimension communication using patterns of light," said team leader Wits Professor Andrew Forbes.

Quantum entanglement has been explored extensively over the past few decades, with the most notable success story being increased communications security through Quantum Key Distribution (QKD).

This method uses qubits (2D quantum states) to transfer a limited amount of information across fibre links by using polarisation as a degree of freedom.

Another degree of freedom is the spatial pattern of light. While this has the benefit of high-dimensional encoding, it requires a custom fibre optical cable, making it unsuitable for existing networks.

"Our team found a new way to balance these two extremes, by combining polarisation qubits with high-dimensional spatial modes to create multi-dimensional hybrid quantum states," said Nape.

"The trick was to twist the one photon in polarisation and twist the other in pattern, forming spirally light that is entangled in two degrees of freedom," said Forbes.

Since the polarisation-entangled photon has only one pattern, it could be sent down the long-distance single-mode fibre, while the twisted-light photon could be measured without the fibre, accessing multi-dimensional twisted patterns in free space.

These twists carry orbital angular momentum, which is distinct from the spin associated with polarisation and is a promising candidate for encoding information.
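
Schematically, a hybrid state of the kind described, written here as a sketch rather than the paper's exact notation, entangles the polarisation of the fibre photon with the twisted pattern of the free-space photon:

$$|\Psi\rangle = \frac{1}{\sqrt{2}}\Big( |H\rangle_{\text{fibre}}\,|{+\ell}\rangle_{\text{free}} + |V\rangle_{\text{fibre}}\,|{-\ell}\rangle_{\text{free}} \Big)$$

Here $H$ and $V$ are horizontal and vertical polarisation and $\pm\ell$ labels the orbital angular momentum of the twisted mode; measuring the free-space photon in different pattern bases is what gives access to the higher-dimensional patterns without changing the fibre.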

Go here to see the original:

Fibre on steroids Wits student uses quantum physics for massive improvements - MyBroadband

Viewpoint: A New Spin on Thermometers for Extremely Low Temperatures – Physics

January 27, 2020 • Physics 13, 7

The temperature of an ultracold gas of rubidium atoms is measured precisely using internal quantum states of a single cesium atom.

Temperature is one of the most widely measured physical quantities. As a notion, it is as old as civilization itself. Yet, scientifically, the meaning and conceptual generality of temperature only fully emerged after intense efforts were made to precisely measure it starting from the 18th century on [1]. That work culminated in the discovery of the absolute temperature scale and revealed the fundamental status temperature has in thermodynamics. Today, temperature measurements, or thermometry, are pushing this foundation to new extremes by probing smaller energies and smaller length scales, where quantum mechanics plays a dominant role. Advances in such measurements have forced a reassessment of basic thermodynamic quantities [2]. They also hold promise for stimulating novel technologies with so-called quantum-enhanced performance [3]. Now, in a new study, Artur Widera from the Technical University of Kaiserslautern, Germany, and colleagues accurately measure the temperature of an ultracold rubidium (Rb) gas using discrete quantized spin states of a cesium (Cs) atom immersed in the gas [4]. This demonstration of quantum-probe thermometry for an ultracold gas promises more accurate measurements of these hard-to-reach regimes.

Ideally, the temperature of a physical system is measurable without detailed knowledge of the system's inner workings. To achieve that goal, scientists use a probe: another system for which they thoroughly understand the temperature dependence of its physical properties. If the probe is put into contact with the system for a sufficient time, then an energy exchange will occur and cause the probe to equilibrate to the system's temperature. This equilibration allows inference of the system's temperature by the concomitant change in some calibrated property of the probe, such as the column height of a liquid in a capillary tube, the electrical resistance of a conducting element, or the refractive index of a medium.

The frontier of thermometry is thermometer miniaturization, with the aim of measuring the difficult-to-access temperatures of very small and cold systems. This goal presents two challenges. First, the probe needs to be much smaller than the system being measured, ensuring that it thermalizes with minimal disturbance to the system. The ultimate minimum size for the probe is a single atom, where information about the system's temperature is mapped onto the atom's quantum state. Second, the characteristic energy scale of the probe needs to be controllable so that it can be tuned to the vicinity of the system's thermal energy, ensuring that the measurement is sensitive. Both these challenges are met by Widera and his co-workers in their new experiment [4].

The team's experimental system consisted of a trapped cloud of just under 10,000 Rb atoms. The atoms were cooled to between 200 and 1000 nK, a regime in which the gas behaves like a classical gas. The temperature of such a gas is accurately determinable by time-of-flight measurements: the velocity distribution of atoms in the cloud is fitted from images taken after the trap is switched off and the cloud is left to expand for some period of time. The system thus serves as a verifiable testbed for an ultracold thermometer.

For the probe, Widera and colleagues turned to the Cs atom, whose internal atomic structure is well characterized (Fig. 1). Cesium possesses seven accessible ground-state hyperfine energy levels for its outer electron, labeled by an angular momentum projection m_Cs = {-3, -2, -1, 0, 1, 2, 3}. Normally these levels all have identical energies. However, applying a weak magnetic field B to the atoms splits the levels into a ladder whose steps have a tuneable energy gap of ΔE/2. The Cs atoms thus behave like effective quantum-mechanical spins. Rubidium atoms also possess three accessible states, labeled m_Rb = {-1, 0, 1}, which turn out to have an energy gap of ΔE in the same B field.
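
In the weak-field (linear Zeeman) regime, the ladder spacing follows the standard formula below; the g-factors quoted are the textbook values for these hyperfine ground states and are an assumption here, since the text gives only the resulting gaps:

$$E(m_F) = g_F\, m_F\, \mu_B B, \qquad |g_F| = \tfrac{1}{4}\ (\mathrm{Cs},\, F=3), \qquad |g_F| = \tfrac{1}{2}\ (\mathrm{Rb},\, F=1)$$

At the same field B, the Cs steps (ΔE/2) are thus half as large as the Rb steps (ΔE), which is what makes the spin-exchange collisions described next energetically lopsided.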

To use the Cs atoms to determine the Rb cloud's temperature, the team exploited so-called spin-exchange collisions, where quanta of angular momentum are transferred between the Cs and Rb atoms. In one type of collision, known as an endoergic collision, the Rb is pushed into a higher-energy state and the Cs into a lower-energy state (Fig. 2). This process requires ΔE/2 of additional energy, which is provided by the motion of the Rb atoms. The occurrence of these collisions depends on the availability of kinetic energy, and therefore the temperature, of the Rb cloud. The spread in the distribution of the Cs atoms' spin-state populations induced by these collisions thus encodes information about the gas temperature.

The team measured the populations of a handful of Cs atoms after 3 s, by which time the system had reached a steady state. They observed that the steady-state fluctuations in the Cs atoms' energies were linearly related to the temperature of the Rb cloud, as independently determined by time-of-flight measurements. The same relationship was found for different applied magnetic fields, different densities of the Rb cloud, and different initial states. This robust result thus convincingly demonstrates that thermometry can be performed using a single-atom quantum probe without the need for detailed model fitting. However, the relatively long time required to reach the steady state is not always accessible. To overcome this problem, Widera's team fitted the experimental data describing the evolution of the Cs probe populations before steady state to a specific microscopic rate model. In this way, they could extract the temperature of the system after just 350 ms of interaction. A theoretical analysis of this approach indicates that only three collisions are needed to obtain a temperature measurement. Furthermore, these measurements are nearly an order of magnitude more sensitive than those performed in the steady state.

This experiment is a fascinating demonstration of a rapid quantum-probe temperature measurement, where the information extracted is maximized and the perturbation to the system is minimized. Future work will undoubtedly exploit the quantumness of the probe beyond spin populations [5] and also utilize universal nonequilibrium properties to avoid the specific model fitting needed here for such measurements [6].

Quantum-probe thermometry has many advantages over conventional time-of-flight measurements, since it is nondestructive, minimally invasive, and spatially localized. The most immediate application is to cold-atom quantum simulations [7], notably strongly interacting fermionic atoms trapped in optical lattices. Such simulation experiments aim to investigate important model systems for which we lack a complete understanding of the physics. This deficiency of knowledge makes it notoriously difficult to directly measure their temperature. Consequently, quantum-probe thermometry will likely be a crucial ingredient for quantum simulations that aim to resolve longstanding questions about these model systems, such as whether they exhibit high-temperature superconductivity [8].

This research is published in Physical Review X.

Stephen R. Clark is a Senior Lecturer in Theoretical Physics at the University of Bristol. He completed his doctoral studies at the University of Oxford in 2007. He has subsequently held research fellowships at the Centre for Quantum Technologies in the National University of Singapore and at Keble College, Oxford, as well as a senior scientist post at the Clarendon Laboratory in the University of Oxford. Before joining Bristol in 2018, he was a Lecturer in physics at the University of Bath. His research focuses on the dynamical properties of driven strongly correlated many-body systems ranging from cold atoms to high-temperature superconductors.

Visit link:

Viewpoint: A New Spin on Thermometers for Extremely Low Temperatures - Physics

Stephen Hawking thought black holes were ‘hairy’. New study suggests he was right. – Big Think

What's it like on the outer edges of a black hole?

This mysterious area, known as the event horizon, is commonly thought of as a point of no return, past which nothing can escape. According to Einstein's theory of general relativity, black holes have smooth, neatly defined event horizons. On the outer side, physical information might be able to escape the black hole's gravitational pull, but once it crosses the event horizon, it's consumed.

"This was scientists' understanding for a long time," Niayesh Afshordi, a physics and astronomy professor at the University of Waterloo, told Daily Galaxy. The American theoretical physicist John Wheeler summed it up by saying: "Black holes have no hair." But then, as Afshordi noted, Stephen Hawking "used quantum mechanics to predict that quantum particles will slowly leak out of black holes, which we now call Hawking radiation."

In the 1970s, Stephen Hawking famously proposed that black holes aren't truly "black." In simplified terms, the theoretical physicist reasoned that, due to quantum mechanics, black holes actually emit tiny amounts of black-body radiation, and therefore have a non-zero temperature. So, contrary to Einstein's view that black holes are neatly defined and are not surrounded by loose materials, Hawking radiation suggests that black holes are actually surrounded by quantum "fuzz" that consists of particles that escape the gravitational pull.

"If the quantum fuzz responsible for Hawking radiation does exist around black holes, gravitational waves could bounce off of it, which would create smaller gravitational wave signals following the main gravitational collision event, similar to repeating echoes," Afshordi said.

A new study from Afshordi and co-author Jahed Abedi could provide evidence of these signals, called gravitational wave "echoes." Their analysis examined data collected by the LIGO and Virgo gravitational wave detectors, which made the first direct detection of gravitational waves in 2015; the echo analysis focused on the 2017 observation of a collision between two distant neutron stars. The results, at least according to the researchers' interpretation, showed relatively small "echo" waves following the initial collision event.

"The time delay we expect (and observe) for our echoes ... can only be explained if some quantum structure sits just outside their event horizons," Afshordi told Live Science.

Scientists have long studied black holes in an effort to better understand fundamental physical laws of the universe, especially since the introduction of Hawking radiation. The idea highlighted the extent to which general relativity and quantum mechanics conflict with each other.

Everywhere, even in a vacuum like the region around an event horizon, pairs of so-called "virtual particles" briefly pop in and out of existence. One particle in the pair has positive mass, the other negative. Hawking imagined a scenario in which a pair of particles emerged near the event horizon, and the positive particle had just enough energy to escape the black hole, while the negative one fell in.

Over time, this process would lead black holes to evaporate and vanish, given that the particle absorbed had a negative mass. It would also lead to some interesting paradoxes.

For example, quantum mechanics predicts that particles would be able to escape a black hole. This idea suggests that black holes eventually die, which would theoretically mean that the physical information within a black hole also dies. This violates a key idea in quantum mechanics, which is that physical information can't be destroyed.

The exact nature of black holes remains a mystery. If confirmed, the recent discovery could help scientists better fuse these two models of the universe. Still, some researchers are skeptical of the recent findings.

"It is not the first claim of this nature coming from this group," Maximiliano Isi, an astrophysicist at MIT, told Live Science. "Unfortunately, other groups have been unable to reproduce their results, and not for lack of trying."

Isi noted that other papers examined the same data but failed to find echoes. Afshordi told Daily Galaxy:

"Our results are still tentative because there is a very small chance that what we see is due to random noise in the detectors, but this chance becomes less likely as we find more examples. Now that scientists know what we're looking for, we can look for more examples, and have a much more robust confirmation of these signals. Such a confirmation would be the first direct probe of the quantum structure of space-time."

Original post:

Stephen Hawking thought black holes were 'hairy'. New study suggests he was right. - Big Think

The Goop Lab’s ‘energy healing’ is no better than placebo, research proves – CNET

In a recurring scene in The Goop Lab, Paltrow and her chief content officer, Elise Loehnen, sit and talk with people who are well known in the alternative wellness world.

Imagine that you could get a full release of all your pent-up emotions and relief from all your physical aches and pains, courtesy of a 60-minute session with an energy healer who flaps his hands four to six feet above your body in the name of quantum physics.

This is what goes down in the fifth, and perhaps the most outrageous, episode of Gwyneth Paltrow's The Goop Lab on Netflix. The docuseries features alternative wellness trends often covered on Paltrow's goop.com and is available to stream now on Netflix.

Though designed to "entertain and inform" (as per the disclaimer), the chiropractor turned "somatic energy practitioner" in this episode certainly makes it sound like everyone should give up their primary care provider for an apparent force-field manipulator.

Is there any promise? Is it all quackery? We investigate, but you probably (hopefully) already know the answer.

Energy healer John Amaral waves his hands like magic wands over three Goop employees (and random guest star, dancer Julianne Hough) to whisk away their emotional traumas and physical aches.

Paltrow asks Amaral why he hasn't, until now, shown his practice on-screen. Amaral gives an, uh, interesting response: "It just looks wacky ... I've been hesitant to show it just because it can look strange. But I think it's time for the world to see." The world sees three Goop-ers and Hough all writhe, wiggle and whimper on the tables. It's as if they're actually being prodded and pulled, without ever being touched.

Hough screams and contorts her body into positions that only a professional dancer could accomplish, and Elise Loehnen, Goop's chief content officer, lets out long, monotone moans that left me mildly uncomfortable.

Only one of the Goop-ers -- Brian, a software architect and self-proclaimed skeptic -- remains relatively still throughout the group treatment. This, to me, strengthens the notion that energy healing is all placebo.

After the fact, Loehnen says the experience felt like an exorcism. Even Paltrow gives a subtle nod to the woo-woo effect of all this, asking Loehnen, "Could you get any Goop-ier?"

I would love to know what gets Goop-ier than this.

Energy healing is a type of alternative wellness therapy that involves manipulating the flow of energy in and around your body. One popular form of energy healing, called reiki, aims to remove "blockages" of energy that have built up where physical and emotional pain have occurred.

For example, people who have chronic headaches might have an energy healer work on the supposed energy fields around their head and neck. A runner who's struggled with repetitive stress injuries in the past might have an energy healer focus on the ankles, knees and hips.

Energy healing is (or should be) performed by a trained practitioner. You lie on a table while the practitioner uses their hands to manipulate the energy fields around your body. The practitioner may not touch you at all or may lightly touch certain areas of your body, such as your neck, to feel and reroute energy.

According to Amaral, "If you just change the frequency of vibration of the body itself, it changes the way the cells regrow, it changes the way the sensory system processes." Amaral admits this is just a hypothesis, but the Goop-ers seem to take it as fact nonetheless.

A 2017 review of studies in the Journal of Alternative and Complementary Medicine states that it is currently impossible to conclude whether or not energy healing is effective for any conditions. The current body of research is too limited and much of it is too flawed. A Cochrane review looking specifically at the effects of reiki on anxiety and depression seconds that conclusion.

A 2019 paper in Global Advances in Health and Medicine, however, gives "energy medicine" some credit, saying that while this type of therapy cannot and should not be used singularly, it can offer an additional element of healing for some people and conditions.

The paper notes that "The healing of a patient must include more than the biology and chemistry of their physical body; by necessity, it must include the mental, emotional, and spiritual (energetic) aspects."

I suppose that since chiropractors were once (not too long ago) considered quacks, there is room for open-mindedness. But according to the International Center for Reiki Training, energy healing has been around for at least 100 years -- usually a treatment can be proven or debunked in less time than that, yet many questions still remain about energy healing.

It is worth noting that placebo effects aren't useless: Even Harvard University acknowledges placebos as effective feel-good tools, helping people overcome fatigue, chronic pain and stress.

For example, one study found that a sham version of reiki (performed by an unlicensed person) was just as effective as the real thing in helping chemotherapy patients feel more comfortable. This proved that energy healing was a placebo, but even so, it was helpful for these patients.

Still, placebos can't cure you.

During reiki or energy healing, the practitioner does not touch you, or only does so very briefly and lightly.

According to science writer Dana G. Smith, this episode "is everything that is wrong with Goop," and it looks like other experts agree with her.

Chris Lee, a physicist and writer at Ars Technica, crushes Amaral's allusions to quantum physics and the famed double-slit experiment, saying: "Quantum mechanics does not provide you with the mental power to balance energies, find ley lines or cure syphilis. It does, unfortunately, seem to provide buzzwords to those prone to prey on the rich and gullible."

I am far from an expert on quantum physics and the vibrational frequency of body cells (whatever that means), but this episode rubbed me the wrong way, largely because it features a beautiful, successful celebrity partaking in what is currently an utterly unproven therapy.

Julianne Hough is a role model to many women who, after watching Hough writhe and wail on a table, might feel the need to do the same thing. I'm a big fan of Hough, but her part in this episode gave me sleazy celebrity endorsement vibes.

Energy healing, reiki or whatever you want to call it, falls comfortably into the "if this makes you feel better, go ahead" category. Energy healers don't actually touch you, and if they do it's just the graze of a fingertip, so the practice is harmless from a physical standpoint.

Theoretically, there's nothing wrong with seeing an energy healer if you can afford it and it makes you feel good. But the controversy comes from the fact that people who need real, proven psychological or physical treatments might ignore that need in favor of this trendy alternative.

Amaral fails to discuss when conventional medical or psychological treatment is the best option, only putting forth his method as the ultimate healing tactic. Amaral cannot mend a broken bone with his energy, nor can he remedy the neurotransmitter imbalances that cause severe depression.

It can be deadly, even, to ignore conventional treatment and rely on unproven therapies. Research has suggested that cancer patients who reject traditional care are less likely to overcome their illness.

But Amaral can, it seems, produce some level of catharsis: If that's what you need, feel free to lie on the table.

The information contained in this article is for educational and informational purposes only and is not intended as health or medical advice. Always consult a physician or other qualified health provider regarding any questions you may have about a medical condition or health objectives.

Excerpt from:

The Goop Lab's 'energy healing' is no better than placebo, research proves - CNET