The Prometheus League
Breaking News and Updates
- Abolition Of Work
- Ai
- Alt-right
- Alternative Medicine
- Antifa
- Artificial General Intelligence
- Artificial Intelligence
- Artificial Super Intelligence
- Ascension
- Astronomy
- Atheism
- Atheist
- Atlas Shrugged
- Automation
- Ayn Rand
- Bahamas
- Bankruptcy
- Basic Income Guarantee
- Big Tech
- Bitcoin
- Black Lives Matter
- Blackjack
- Boca Chica Texas
- Brexit
- Caribbean
- Casino
- Casino Affiliate
- Cbd Oil
- Censorship
- Cf
- Chess Engines
- Childfree
- Cloning
- Cloud Computing
- Conscious Evolution
- Corona Virus
- Cosmic Heaven
- Covid-19
- Cryonics
- Cryptocurrency
- Cyberpunk
- Darwinism
- Democrat
- Designer Babies
- DNA
- Donald Trump
- Eczema
- Elon Musk
- Entheogens
- Ethical Egoism
- Eugenic Concepts
- Eugenics
- Euthanasia
- Evolution
- Extropian
- Extropianism
- Extropy
- Fake News
- Federalism
- Federalist
- Fifth Amendment
- Financial Independence
- First Amendment
- Fiscal Freedom
- Food Supplements
- Fourth Amendment
- Free Speech
- Freedom
- Freedom of Speech
- Futurism
- Futurist
- Gambling
- Gene Medicine
- Genetic Engineering
- Genome
- Germ Warfare
- Golden Rule
- Government Oppression
- Hedonism
- High Seas
- History
- Hubble Telescope
- Human Genetic Engineering
- Human Genetics
- Human Immortality
- Human Longevity
- Illuminati
- Immortality
- Immortality Medicine
- Intentional Communities
- Jacinda Ardern
- Jitsi
- Jordan Peterson
- Las Vegas
- Liberal
- Libertarian
- Libertarianism
- Liberty
- Life Extension
- Macau
- Marie Byrd Land
- Mars
- Mars Colonization
- Mars Colony
- Memetics
- Micronations
- Mind Uploading
- Minerva Reefs
- Modern Satanism
- Moon Colonization
- Nanotech
- National Vanguard
- NATO
- Neo-eugenics
- Neurohacking
- Neurotechnology
- New Utopia
- New Zealand
- Nihilism
- Nootropics
- NSA
- Oceania
- Offshore
- Olympics
- Online Casino
- Online Gambling
- Pantheism
- Personal Empowerment
- Poker
- Political Correctness
- Politically Incorrect
- Polygamy
- Populism
- Post Human
- Post Humanism
- Posthuman
- Posthumanism
- Private Islands
- Progress
- Proud Boys
- Psoriasis
- Psychedelics
- Putin
- Quantum Computing
- Quantum Physics
- Rationalism
- Republican
- Resource Based Economy
- Robotics
- Rockall
- Ron Paul
- Roulette
- Russia
- Sealand
- Seasteading
- Second Amendment
- Seychelles
- Singularitarianism
- Singularity
- Socio-economic Collapse
- Space Exploration
- Space Station
- Space Travel
- Spacex
- Sports Betting
- Sportsbook
- Superintelligence
- Survivalism
- Talmud
- Technology
- Teilhard De Charden
- Terraforming Mars
- The Singularity
- Tms
- Tor Browser
- Trance
- Transhuman
- Transhuman News
- Transhumanism
- Transhumanist
- Transtopian
- Transtopianism
- Ukraine
- Uncategorized
- Vaping
- Victimless Crimes
- Virtual Reality
- Wage Slavery
- War On Drugs
- Waveland
- Ww3
- Yahoo
- Zeitgeist Movement
- Prometheism
- Forbidden Fruit
- The Evolutionary Perspective
Monthly Archives: August 2021
Time to play some games? The best virtual reality games for Linux – Linux News – BollyInside
Posted: August 28, 2021 at 11:49 am
It's possible to deep dive into the virtual reality gaming world on your Linux system. Want to explore VR games on Linux? This article takes you through the top 3 VR games available on Linux.
What are VR Games?
VR games are the new-gen computer games enabled with virtual reality, in short, VR technology. It gives players a first-person perspective of all the gaming actions. As a participant, you can enjoy the gaming environment through your VR gaming devices, such as hand controllers, VR headsets, sensor-equipped gloves, and others.
Ready to get amazed? Let's start.
VR games are played on gaming consoles, standalone systems, powerful laptops, and PCs compatible with VR headsets including HTC Vive, Oculus Rift, HP Reverb G2, Valve Index, and others.
A lot of VR objects are usable as they are in reality and the gaming developers are making the VR universe more and more immersive with each passing day.
Now, a little brief about VR technology. By now, you know that VR is an abbreviation of Virtual Reality. This is, basically, a computer-generated simulation in which the player controls generated objects through limb and facial movements in a three-dimensional environment. This environment is interacted with through special equipment, like clothing with touch-simulating pressure nodes and enclosed glasses with screens in front, instead of lenses.
How to Get VR Games on Linux
The Steam store seems to be the best way to get VR games on your system. Good news: you don't need to worry about installing all the modules and software to run the game smoothly. The Steam client is ready to take on all the worries. So, get a Steam account by downloading the client from Steam's site.
Back in 2019, it was reported that VR Linux desktops were around the corner. What about now? Xrdesktop is here for you. Xrdesktop is free to use and lets you work with common desktop environments, like GNOME and KDE.
The SimulaVR is a similar open-source project to check out.
Top 3 VR Games Available on Linux
Now the fun part: In this section, we'll share the best 3 VR games to play on Linux in your gaming time.
DOTA 2
DOTA 2 by Valve is a multiplayer online battleground and one of the most beloved titles among video game lovers. It's ruled this generation's hearts for the past 8 years. DOTA 2 is a successor of Defence of the Ancients, in short, DOTA. DOTA was a mod created by the gaming community for Warcraft III of Blizzard Entertainment. A fact about DOTA 2: it's the very first video game to offer 1 million USD as a prize pool. That's huge, isn't it? That was about 9 years ago. Now DOTA 2 offers a whopping 31+ million USD to the winners. And that's not a joke. The VR component of this gaming arena is an environment worth discovering. So, get ready with your VR headset for playing DOTA 2 on Linux.
The Talos Principle VR
Next up we have The Talos Principle VR. Are you into puzzle games? Then this virtual reality game by Croteam is the perfect one for you. The Talos Principle, actually, is the VR form of a critically acclaimed first-person puzzle based on philosophical science fiction brought by Croteam earlier. It's a real brainstormer, as the game makes you think through all its levels in an unconventional way. It's the difficulty of the game that gets you hooked. The narrator brings elements of philosophy that spike more interest. The VR port works well by presenting us, the players, as an unknown element in its world.
Locomancer
If you like crafting and building new designs, Locomancer is one to try on your Linux system. This is a toy-box-style simulator created for touch-enabled VR headsets. In this VR video game, you build train models and tracks. You can also test your creation by getting on a train ride. It has lots of fun mechanics, and only some playing time is needed to get familiar with Locomancer. Once experienced enough, you're good to build your own. This game is somewhat reminiscent of the Noddy's Toyland cartoon show of the '90s. Anyway, Locomancer is definitely a wise buy for video gamers and is always recommended by the gaming community.
Wrapping Up
Apart from the above video games, if you want to experience something interesting, do check out Google Earth VR. You enter the place you want to reach and you're immediately placed there. And from there you can walk to all other places nearby and experience being in numerous cities that exist today. This bit of wandering is sure to give you some desired relief in your time off.
Key Applications of Virtual Reality in Medicine – CIO Applications
Posted: at 11:49 am
Fremont, CA: Virtual reality (VR) is the name given to the technology that enables a user to use a VR headset to create a situation or experience of interest within an interactive but computer-generated environment. The simulation is interactive, and it may necessitate the use of special 3-D goggles with a screen or gloves that provide sensory feedback to assist the user in learning from experience in this virtual world.
Medical Training
Current medical education has shifted away from rote memorization of facts and toward imparting skills in using facts to arrive at an appropriate management strategy when confronted with a given patient. This training consists of problem-solving skills, communication skills, and VR-based learning.
Virtual reality can be used to simulate any type of medical situation, allowing students to deal with it as if it were real life. This is followed by feedback and debriefing so that they can learn from their mistakes if any exist. Because VR systems are inexpensive and faculty are not required to be present, access is more flexible and broad-based.
VR can be used to assist medical professionals in visualizing the interior of the human body, revealing previously inaccessible areas. For example, the dissection of cadavers, which was once required of every new medical student, has given way to the study of human anatomy through virtual reality.
Treating Patients
Virtual reality is useful in pre-planning complex operations, such as neurosurgical procedures, because it allows the surgical team to walk through the entire procedure and practice their planned intervention.
This increases safety by reducing surprises. CT, MRI, and ultrasound scans are used to compile the data, which is supplemented by VR and haptics. In surgical settings, the reconstruction looks and feels like the real patient.
Virtual reality is also important in surgical robotics, which is based on a robotic arm controlled by a human surgeon at a console. The surgeon is reliant on the camera embedded within the body to give a view of the area being operated on. However, tactile and sensory feedback is also important during surgery, and VR may provide a reasonable substitute in their absence.
Patients suffering from phobias, for example, find VR extremely beneficial, as do their therapists. To treat clients suffering from agoraphobia or acrophobia, for example, a corresponding situation may be recreated in the therapist's own center to help the client face it gradually. This is also true for post-traumatic stress disorder.
Dallas’ MyndVR and Healing HealthCare Systems Offer Virtual Escapes for Seniors and Care Home Residents Dallas Innovates – dallasinnovates.com
Posted: at 11:49 am
C.A.R.E. VRx environment [Image: MyndVR]
Dallas-based MyndVR, a leading provider of virtual reality solutions for seniors and active agers, has partnered with Healing HealthCare Systems to offer Continuous Ambient Relaxation Environment VRx (C.A.R.E. VRx) to residents who use the MyndVR system at senior and memory care communities, assisted living centers, hospice care, and in the home.
Using gaze-based navigation with VR headsets, users can interact with the immersive C.A.R.E. VRx app, which blends voice-based, guided imagery with 360-degree visuals of forests, lakes, beaches, and more.
Patients using the app get a chance to meditate, focus breathing, and relax while remaining immersed in the tranquil nature environments.
Studies show that VR experiences like these can help patients heal, lower blood pressure, ward off depression, and more.
"We're excited to start offering the C.A.R.E. VRx application to our growing network of senior living communities and to families and individuals aging at home," said Chris Brickler, CEO of MyndVR, in a statement.
"We're always looking to continue enhancing the lives of seniors, bringing them joy, tranquility, and adventure through virtual reality," Brickler added. "This new partnership allows us to expand our content library offerings and provide even more experiences for those who are looking for a brief respite from their reality."
Healing HealthCare's CEO cited the pandemic's impact on seniors.
"Throughout the pandemic, many older adults have remained isolated from loved ones, creating feelings of loneliness and depression," said Susan E. Mazer, Ph.D., president and CEO of Healing HealthCare Systems. "Studies document the positive impact that virtual reality can have on patients, and even as we emerge from the pandemic the technology will continue to play a pivotal role in the overall health, wellbeing and care of older adults."
To complement the C.A.R.E. VRx experience, MyndVR is also offering an array of aromatherapy oils matching the visual and musical content.
The future of ‘extended reality’ tourism is now, thanks to the pandemic – Frederick News Post
Posted: at 11:49 am
When the coronavirus pandemic shut down travel around the world, some natural, historical and cultural sites saw it as a call to redouble their efforts to embrace extended reality, both to let people tour these destinations from afar and to develop meaningful new ways for travelers to experience them on-site, in hopes of luring them back after the health emergency eased.
Extended reality is the umbrella term for technologies that allow the interaction of physical and virtual worlds, such as augmented reality (AR) and virtual reality (VR). Each has opened up many possibilities for tourism. AR allows tourists at an ancient monument to experience its past glory, for example; VR, on the other hand, allows viewers to visit a historic site or museum remotely. A more intense focus on these technologies, experts say, will be a lasting legacy of the pandemic.
"When COVID-19 happened, every destination tried to offer an alternative way of communicating with its tourists," says Suleiman Farajat, chief of the Petra Development and Tourism Region Authority in Jordan. Because Petra is a UNESCO World Heritage site and Jordan's most important historical landmark, officials at the destination had already been planning VR-based experiences. Thus, they were able to launch the Xplore Petra app in June 2020.
Made by TimeLooper, a tech company specializing in re-creating historical locations and events, the app gives users an immersive 3D map of Petra at scale, with a bird's-eye view of the entire ancient city and its landmarks. At points of interest such as the amphitheater, monastery, royal tombs and treasury, users experience life-size 3D models and panoramic photos. "You can enjoy the site, regardless of your location," Farajat says.
Several other creative remote options have launched since the pandemic started:
- The Faroe Islands' remote tourism tool, with which users can interact live with a local and use the latter as their eyes and ears to experience the islands.
- A VR reconstruction of the Baalbek ruins in Lebanon, created by Flyover Zone Productions, the German Archaeological Institute and the Lebanese Culture Ministry's Directorate General of Antiquities.
- The luxury tour company &Beyond's virtual safari program, which livestreams from game reserves in South Africa. You can also book Zoom sessions with rangers.
- A VR-based northern lights tour by the Swedish travel company Lights over Lapland.
The New York-based TimeLooper has VR apps for numerous locations other than Petra, ranging from historical and heritage sites to famous parks and museums, such as the Hiroshima Peace Memorial Park, mining-era Breckenridge in Colorado, the Grand Canyon and a redwood canopy in California.
"In the pandemic world," co-founder Andrew Feinberg says, "we have worked with our institutional partners to dramatically and quickly scale up their digital presence, which for many of these partners was nonexistent." He has witnessed an increasing appreciation for the benefits of a digital presence, not only to attract tourists but to enable virtual field trips for schoolchildren, such as the one for Black History Month in February.
He sees great possibility for on-site visits as well: "The ability to deploy 25, 30, 35 virtual reality headsets on-site at once for a synchronous experience for the visitor, that was impossible to do five years ago."
Farajat's vision is to have VR/AR as one item in a digital buffet served at historical sites, which would also include projectors, holograms and lighting.
"Projectors can do wonders," he says.
In fact, he believes projections could rival AR in delivering immersive experiences. An example would be work being done by LithodomosVR, using large-scale 360-degree projections supplemented with VR to bring ancient Rome and ancient Greece to life.
But while the pandemic has sped up the process, "we are still probably at the very early stages of adoption of AR and VR, both in terms of the technology itself and its applications at tourist sites," says Ahmed Emara, a digital artist based in Alexandria, Egypt.
"It's promising, and it's something that doesn't have an equal, but [it] is not ready yet."
The technology needs to grow to "a threshold of convenience" to be viable, he adds.
For example, at an outdoor site with bright sunlight, mobile and tablet screens will be almost impossible to view, making AR apps useless, while moving crowds will break the AR experience.
"In sites like the Valley of the Kings and all the tombs, it can get really crowded, [making it] a logistical hindrance to using this technology," Emara says.
Moreover, AR and VR applications are not socially shareable and are limited to just the person with the device, be it a pair of VR specs or a smartphone.
Another challenge is deploying technology without making local guides redundant.
"In a site like Petra, where we heavily rely on tourism, the current situation is not easy because the community has no income," Farajat says. "It is very important not to make the tour guides fear that augmented reality or virtual tours, or even applications and audio guides, could take their jobs away."
To prevent that from happening, he hopes that eventually it will be the tour guides themselves who sell AR/VR experiences, so they can profit from it as well.
"[If] you ask me where we see this going," TimeLooper's Feinberg says, "we believe that the hardware, as it continues to evolve, is going to ensure that these experiences are more and more accessible."
In the future, he hopes to develop tools that will allow archaeologists, educators and tourist sites to develop their own digital experiences.
"That for us is where the next frontier will be."
Expert: Now is the time to prepare for the quantum computing revolution – TechRepublic
Posted: at 11:49 am
Though quantum computing is likely five to 10 years away, waiting until it happens will put your organization behind. Don't play catch-up later.
TechRepublic's Karen Roby spoke with Christopher Savoie, CEO and co-founder of Zapata Computing, a quantum application company, about the future of quantum computing. The following is an edited transcript of their conversation.
Christopher Savoie: There are two types of quantum-computing algorithms, if you will. There are those that will require what we call a fault-tolerant computing system, one that doesn't have error, for all intents and purposes, that's corrected for error, which is the way most classical computers are now. They don't make errors in their calculations, or at least we hope they don't, not at any significant rate. And eventually we'll have these fault-tolerant quantum computers. People are working on it. We've proven that it can happen already, so that is down the line. But it's in the five- to 10-year range that it's going to take until we have that hardware available. But that's where a lot of the promise for these exponentially faster algorithms lies. So, these are the algorithms that will use these fault-tolerant computers to basically look at all the options available in a combinatorial matrix.
So, if you have something like Monte Carlo simulation, you can try significantly all the different variables that are possible and look at every possible combination and find the best optimal solution. So, that's really, practically impossible on today's classical computers. You have to choose what variables you're going to use and reduce things and take shortcuts. But with these fault-tolerant computers, for significantly many of the possible solutions in the solution space, we can look at all of the combinations. So, you can imagine almost an infinite amount or an exponential amount of variables that you can try out to see what your best solution is. In things like CCAR [Comprehensive Capital Analysis and Review], Dodd-Frank [Dodd-Frank Wall Street Reform and Consumer Protection Act] compliance, these things where you have to do these complex simulations, we rely on a Monte Carlo simulation.
So, trying all of the possible scenarios. That's not possible today, but this fault tolerance will allow us to try significantly all of the different combinations, which will hopefully give us the ability to predict the future in a much better way, which is important in these financial applications. But we don't have those computers today. They will be available sometime in the future. I hate putting a date on it, but think about it on the decade time horizon. On the other hand, there are these nearer-term algorithms that run on these noisy, so not error-corrected, noisy intermediate-scale quantum devices. We call them NISQ for short. And these are more heuristic types of algorithms that are tolerant to noise, much like neural networks are today in classical computing and [artificial intelligence] AI. You can deal a little bit with sparse data and maybe some error in the data or other areas of your calculation. Because it's an approximate type of calculation, like neural networks do. It's not looking at the exact answers, all of them, and figuring out which one is definitely the best. This is an approximate algorithm that iterates and tries to get closer and closer to the right answer.
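To make the Monte Carlo idea concrete, here is a minimal classical sketch in Python. The portfolio weights, return distributions and trial count are illustrative assumptions, not anything from IonQ or the interview; the point is that a classical machine samples only a sliver of the scenario space that Savoie describes a fault-tolerant machine exploring.

```python
import random

# Classical Monte Carlo sketch: estimate the loss distribution of a toy
# two-asset portfolio by sampling random market scenarios. A fault-tolerant
# quantum computer could, in principle, explore the scenario space far more
# exhaustively; a classical sampler only visits a tiny fraction of it.
def simulate_var(trials=100_000):
    losses = []
    for _ in range(trials):
        # Draw daily returns for two hypothetical assets.
        r1 = random.gauss(0.0005, 0.02)
        r2 = random.gauss(0.0003, 0.03)
        portfolio_return = 0.6 * r1 + 0.4 * r2
        losses.append(-portfolio_return)
    losses.sort()
    # 99th-percentile loss: a crude one-day value-at-risk estimate.
    return losses[int(0.99 * trials)]

print(f"Estimated 99% one-day VaR: {simulate_var():.4f}")
```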
But we know that neural networks work this way, deep neural networks. AI, in its current state, uses this type of algorithm, these heuristics. Most of what we do in computation nowadays and in finance is heuristic in its nature and statistical in its nature, and it works well enough to do some really good work. In algorithmic trading, in risk analysis, this is what we use today. And these quantum versions of that will also be able to give us some advantage, and maybe an advantage over the purely classical version of that, as we've been able to show in recent work. So, we'll have some quantum-augmented AI, quantum-augmented [machine learning] ML. We call it quantum-enhanced ML or quantum-enhanced optimization that we'll be able to do.
So, people think of this as a dichotomy. We have these NISQ machines, and they're faulty, and then one day we'll wake up and we'll have this fault tolerance, but it's really not that way. These faulty algorithms, if you will, these heuristics that are approximate, they will still work, and they may work better than the fault-tolerant algorithms for some problems and some datasets, so this really is a gradient. It really is. You'd have a false sense of solace, maybe, to say: "Oh well, if that's 10 years down the road, we can just wait till we wake up and have fault tolerance." But really the algorithms are going to be progressing. And the things that we develop now will still be useful in that fault-tolerant regime. And the patents will all be good for the stuff that we do now.
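The iterate-toward-the-answer pattern Savoie describes can be sketched classically. In the toy Python below, a hypothetical noisy cost function stands in for running a parameterized quantum circuit, and a simple random-search loop plays the role of the classical optimizer; everything here is an illustrative assumption rather than IonQ's actual method.

```python
import random

# Sketch of the variational (NISQ-style) pattern: a classical optimizer
# repeatedly adjusts parameters while a noisy cost evaluation stands in
# for sampling a parameterized quantum circuit on noisy hardware.
def noisy_cost(params):
    # Hypothetical objective with its minimum at (1.0, -2.0), plus
    # Gaussian noise to mimic finite-shot sampling error.
    x, y = params
    return (x - 1.0) ** 2 + (y + 2.0) ** 2 + random.gauss(0, 0.01)

params = [0.0, 0.0]
step = 0.5
for _ in range(200):
    candidate = [p + random.uniform(-step, step) for p in params]
    if noisy_cost(candidate) < noisy_cost(params):
        params = candidate  # keep the improvement, discard the rest

print("Best parameters found:", [round(p, 2) for p in params])
```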
So, thinking that, "OK, this is a 10 year time horizon for those fault-tolerant computers. Our organization is just going to wait." Well, if you do, you get a couple of things. You're not going to have the workforce in place to be able to take advantage of this. You're probably not going to have the infrastructure in place to be able to take advantage of this. And meanwhile, all of your competitors and their vendors have acquired a portfolio of patents on these methodologies that are good for 20 years. So, if you wait five years from now and there's a patent four years down the line, that's good for 24 years. So there really is, I think, an incentive for organizations to really start working, even in this NISQ, this noisier regime that we're in today.
Karen Roby: You get a little false sense of security, as you mentioned, of something, oh, you say that's 10 years down the line, but really with this, you don't have the luxury of catching up if you wait too long. This is something that people need to be focused on now for what is down the road.
Christopher Savoie: Yes, absolutely. And in finance, if you have a better ability to detect risks than your competitors, you're at a huge advantage, able to find alpha in the market. If you can do that better than others, you're going to be at a huge advantage. And if you're blocked by people's patents or blocked by the fact that your workforce doesn't know how to use these things, you're really behind the eight ball. And we've seen this time and time again with different technology evolutions and revolutions. With big data and our use of big data, with that infrastructure, with AI and machine learning. The organizations that have waited generally have found themselves behind the eight ball, and it's really hard to catch up because this stuff is changing daily, weekly, and new inventions are happening. And if you don't have a workforce that's up and running and an infrastructure ready to accept this, it's really hard to catch up with your competitors.
Karen Roby: You've touched on this a little bit, but really for the finance industry, this can be transformative, really significant what quantum computing can do.
Christopher Savoie: Absolutely. At the end of the day, finance is math, and we can do better math and more accurate math on large datasets with quantum computing. There is no question about that. It's no longer an "if." Google has, with their experiment, proven that at some point we're going to have a machine that is definitely going to be better at doing math, some types of math, than classical computers. With that premise, if you're in a field that depends on math, that depends on numbers, which is everything, and statistics, which is finance, no matter what side you're on. If you're on the risk side or the investing side, you're going to need to have the best tools. And that doesn't mean you have to be an algorithmic trader necessarily, but even looking at tail risk and creating portfolios and this kind of thing. You're dependent on being able to quickly ascertain what that risk is, and computing is the only way to do that.
And on the regulatory side, I mentioned CCAR. I think as these capabilities emerge, it allows the regulators to ask for even more scenarios to be simulated, those things that are a big headache for a lot of companies. But it's important because our global financial system depends on stability and predictability, and to be able to have a computational resource like quantum that's going to allow us to see more variables or more possibilities or more disaster scenarios. It can really help. "What is the effect of, say, a COVID-type event on the global financial system?" To be more predictive of that and more accurate at doing that is good for everybody. I think all boats rise, and quantum is definitely going to give us that advantage as well.
Karen Roby: Most definitely. And Christopher, before I let you go, if you would just give us a quick snapshot of Zapata Computing and the work that you guys do.
Christopher Savoie: We have two really important components to try and make this stuff reality. On the one hand, we've got over 30 of the brightest young minds and algorithms, particularly for these near-term devices and how to write those. We've written some of the fundamental algorithms that are out there to be used on quantum computers. On the other hand, how do you make those things work? That's a software engineering thing. That's not really quantum science. How do you make the big data work? And that's all the boring stuff of ETL and data transformation and digitalization and cloud and multicloud and all this boring but very important stuff. So basically Zapata is a company that has the best of the algorithms, but also best-of-breed means of actually software engineering that in a modern, multicloud environment that particularly finance companies, banks, they're regulated companies with a lot of data that is sensitive and private and proprietary. So, you need to be able to work in a safe and secure multicloud environment, and that's what our software engineering side allows us to do. We have the best of both worlds there.
Quantum computers could read all your encrypted data. This ‘quantum-safe’ VPN aims to stop that – ZDNet
Posted: at 11:49 am
To protect our private communications from future attacks by quantum computers, Verizon is trialing the use of next-generation cryptography keys to protect the virtual private networks (VPNs) that are used every day by companies around the world to prevent hacking.
Verizon implemented what it describes as a "quantum-safe" VPN between one of the company's labs in London in the UK and a US-based center in Ashburn, Virginia, using encryption keys that were generated with post-quantum cryptography methods, meaning that they are robust enough to withstand attacks from a quantum computer.
According to Verizon, the trial successfully demonstrated that it is possible to replace current security processes with protocols that are quantum-proof.
VPNs are a common security tool used to protect connections made over the internet, by creating a private network from a public internet connection. When a user browses the web with a VPN, all of their data is redirected through a specifically configured remote server run by the VPN host, which acts as a filter that encrypts the information.
This means that the user's IP address and any of their online activities, from sending emails to paying bills, come out as gibberish to potential hackers even on insecure networks like public WiFi, where eavesdropping is much easier.
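As a rough illustration of why tunneled traffic reads as gibberish, the Python sketch below applies symmetric encryption to a sample "packet" using the Fernet interface of the third-party cryptography package. This is a generic stand-in for the ciphers a real VPN negotiates, not Verizon's implementation.

```python
# Minimal sketch of an encrypted tunnel: symmetric encryption with a shared
# key, standing in for the cipher a VPN applies to every packet.
# Requires the third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # negotiated between client and VPN server
tunnel = Fernet(key)

packet = b"GET /bank/statement HTTP/1.1"
ciphertext = tunnel.encrypt(packet)
print(ciphertext)                    # opaque bytes to any eavesdropper
print(tunnel.decrypt(ciphertext))    # only the key holder recovers the data
```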
Especially in the last few months, which have seen many employees switching to full-time working from home, VPNs have become an increasingly popular tool to ensure privacy and security on the internet.
The technology, however, is based on cryptography protocols that are not un-hackable. To encrypt data, VPN hosts use encryption keys that are generated by well-established algorithms such as RSA (Rivest-Shamir-Adleman). The difficulty of cracking the key, and therefore of reading the data, is directly linked to the algorithm's ability to create as complicated a key as possible.
In other words, encryption protocols as we know them are essentially a huge math problem for hackers to solve. With existing computers, cracking the equation is extremely difficult, which is why VPNs, for now, are still a secure solution. But quantum computers are expected to bring about huge amounts of extra computing power and with that, the ability to hack any cryptography key in minutes.
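A toy example makes the "huge math problem" concrete. The Python sketch below builds an RSA key pair from deliberately tiny primes and then breaks it by brute-force factoring; with real key sizes the factoring step is classically infeasible, and it is exactly the step a large quantum computer running Shor's algorithm would accelerate. All numbers here are illustrative only.

```python
# Toy RSA with tiny primes, showing that the private key falls out as soon
# as the modulus n can be factored. Requires Python 3.8+ for pow(e, -1, m).
p, q = 61, 53                  # real keys use primes hundreds of digits long
n = p * q                      # public modulus
phi = (p - 1) * (q - 1)
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent (modular inverse)

msg = 42
cipher = pow(msg, e, n)        # encrypt with the public key
print(pow(cipher, d, n))       # decrypt with the private key -> 42

# An attacker who factors n recovers the private key and reads everything:
for cand in range(2, n):
    if n % cand == 0:
        p2, q2 = cand, n // cand
        d2 = pow(e, -1, (p2 - 1) * (q2 - 1))
        print(pow(cipher, d2, n))  # -> 42 again, without ever being given d
        break
```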
"A lot of secure communications rely on algorithms which have been very successful in offering secure cryptography keys for decades," Venkata Josyula, the director of technology at Verizon, tells ZDNet. "But there is enough research out there saying that these can be broken when there is a quantum computer available at a certain capacity. When that is available, you want to be protecting your entire VPN infrastructure."
One approach that researchers are working on consists of developing algorithms that can generate keys that are too difficult to hack, even with a quantum computer. This area of research is known as post-quantum cryptography, and is particularly sought after by governments around the world.
In the US, for example, the National Institute of Standards and Technology (NIST) launched a global research effort in 2016 calling on researchers to submit ideas for algorithms that would be less susceptible to a quantum attack. A few months ago, the organization selected a group of 15 algorithms that showed the most promise.
"NIST is leading a standardization process, but we didn't want to wait for that to be complete because getting cryptography to change across the globe is a pretty daunting task," says Josyula. "It could take 10 or even 20 years, so we wanted to get into this early to figure out the implications."
Verizon has significant amounts of VPN infrastructure and the company sells VPN products, which is why the team started investigating how to start enabling post-quantum cryptography right now and in existing services, Josyula adds.
One of the 15 algorithms identified by NIST, called Saber, was selected for the test. Saber generated quantum-safe cryptography keys that were delivered to the endpoints in London and Ashburn of a typical IPsec VPN through an extra layer of infrastructure, which was provided by a third-party vendor.
Whether Saber makes it to the final rounds of NIST's standardization process, in this case, doesn't matter, explains Josyula. "We tried Saber here, but we will be trying others. We are able to switch from one algorithm to the other. We want to have that flexibility, to be able to adapt in line with the process of standardization."
In other words, Verizon's test has shown that it is possible to implement post-quantum cryptography candidates on infrastructure links now, with the ability to migrate as needed between different candidates for quantum-proof algorithms.
This is important because, although a large-scale quantum computer could be more than a decade away, there is still a chance that the data that is currently encrypted with existing cryptography protocols is at risk.
The threat is known as "harvest now, decrypt later" and refers to the possibility that hackers could collect huge amounts of encrypted data and sit on it while they wait for a quantum computer to come along that could read all the information.
"If it's your Amazon shopping cart, you may not care if someone gets to see it in ten years," says Josyula. "But you can extend this to your bank account, personal number, and all the way to government secrets. It's about how far into the future you see value for the data that you own and some of these have very long lifetimes."
For this type of data, it is important to start thinking about long-term security now, which includes the risk posed by quantum computers.
A quantum-safe VPN could be a good start even though, as Josyula explains, many elements still need to be smoothed out. For example, Verizon still relied on standard mechanisms in its trial to deliver quantum-proof keys to the VPN end-points. This might be a sticking point, if it turns out that this phase of the process is not invulnerable to quantum attack.
The idea, however, is to take proactive steps to prepare, instead of waiting for the worst-case scenario to happen. Connecting London to Ashburn was a first step, and Verizon is now looking at extending its quantum-safe VPN to other locations.
Who will dominate the tech arms race? – The Jerusalem Post
Posted: at 11:49 am
"It is almost impossible to overstate what a quantum computer will be able to do," Christopher Monroe told the Magazine in a recent interview.
Monroe, a professor at both the University of Maryland and Duke University as well as co-founder of the quantum computing company IonQ, discussed how quantum computing will change the face of the planet, even if this might take some more time.
The Magazine also interviewed four other experts in the quantum field and visited seven of their labs at the University of Maryland.
These labs, the full likes of which do not yet exist in Israel, hosted all kinds of qubits (the basis of quantum computers), lasers blasting targets to cause plasma to come off to form distinctive films, infrared lasers, furnaces reaching 2,000°C, a tetra arc furnace for growing silicon crystals, special dilution refrigerators to achieve cryostorage (deep freezing) and a variety of vacuum chambers that would seem like an alternate reality to the uninitiated.
Before entering each lab, there needed to be a conversation about whether this reporter should be wearing the special goggles that were handed out to avoid getting blinded.
One top quantum official at Maryland, Prof. Dr. Johnpierre Paglione, assured the Magazine that the ultrahazardous materials warning on many of the lab doors was not a concern at that moment.
From cracking the Internet as we know it, to military and economic dominance, to changing the way people manage their lives, quantum computers are predicted to make mincemeat of today's supercomputers. Put simply, they are made out of and operate from a completely different kind of material and set of principles connected to qubits and quantum mechanics, with computing potential that dwarfs classical computers' capabilities.
But let's say the US wins the race: who in the US would win it? Would it be giants like Google, Microsoft, Amazon, IBM and Honeywell? Or might it be a lean and fast, solely quantum-focused challenger like Monroe's IonQ?
At first glance, Google has no real challenger. In 2019, Google said it achieved quantum supremacy when its quantum computer became the first to perform a calculation that would be practically impossible for a classical machine, by checking the outputs from a quantum random-number generator.
The search-engine giant has already built a 54-qubit computer, whereas IonQ's largest quantum computer has only 32 qubits. Google has also promised to achieve the holy grail of quantum computing, a system large enough to revolutionize the Internet and military and economic issues, by 2029. Although China recently reproduced Google's experiment, Google is still regarded as ahead of the game.
Why is a 32-qubit quantum computer better than a 54-qubit one?
So why is Monroe so confident that his company will finish the race long before Google?
First, he takes a shot at the Google 2019 experiment.
"It was a fairly academic exercise. The problem they attacked was one of those rare problems where you can prove something and you can prove the supercomputer cannot do it. Quantum mechanics works. It is not a surprise. The problem Google tackled was utterly useless. The system was not flexible enough to program to hit other problems. So a big company did a big academic demonstration," he said with a sort of whoop-dee-do tone and expression on his face.
"Google had to repeat its experiment millions of times. The signal went down by orders of magnitude. There are special issues to get the data. There are general problems where it cannot maintain [coherence]. The Google experiment and qubits decayed by seven times the constant. We gauge on one time constant, and we can do 100 operations with IonQ's quantum computers."
In radioactive decay, the time constant is related to the decay constant and essentially represents the average lifetime of a decaying system, such as an atom. Some of the tactics for potentially overcoming decay go back to the lasers, vacuum chambers and cryostorage refrigerators mentioned above.
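For reference, the standard decay law behind that "time constant" language (a textbook formula, not something from the article) is:

```latex
% Exponential decay: the time constant \tau is the reciprocal of the decay
% constant \lambda and equals the mean lifetime of the decaying system.
N(t) = N_0 \, e^{-\lambda t} = N_0 \, e^{-t/\tau}, \qquad \tau = \frac{1}{\lambda}
```

After one time constant tau, the signal falls to 1/e, roughly 37%, of its initial value; on this reading, decay over "seven times the constant" leaves a factor of about e^-7, under a tenth of a percent.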
Monroe said that from a business perspective, the experiment was "a big distraction, and you will hear this from Google computer employees. They had to run simulations to prove how hard it would be to do what they were doing with old computers, instead of building better quantum computers and solving useful algorithms."
"We believe quantum computers work; now it is time to build them," he stressed.
Describing IonQ's quantum computers, Monroe said: "The 32-qubit computer is fifth generation. The third and fourth generations are available to [clients of] Microsoft, Amazon and Google Cloud. It is 11 qubits, which is admittedly small, but it still runs more than any IBM machine can run. An 11-qubit computer is very clean operationally. It can run 100 or so ops [operations] before the laser noise causes coherence to be lost [before the qubits stop working]. That is many more ops than superconductors. If [a computer] has one million qubits but can only run a few ops, it is boring. But with trapped ions, adding more qubits at the same time makes things cheaper."
He added: "The 32-qubit computer is not yet on the cloud. We are working in private with customers' financials," noting that a future publication will discuss the baby version of an algorithm "which could be very interesting when you start to scale it up. Maybe in the next generation, we can engineer it to solve an optimization problem, something we don't get from the cloud, where we don't get any telemetry, which would be an unusual benefit for clients."
According to Monroe, the fact that he will be able to build a 1,000-qubit computer by 2025, practically tomorrow in the sphere of new inventions, will in and of itself be game-changing. This is true even if it is not yet capable of accomplishing all the extreme miracles that much larger quantum computers may someday accomplish.
A major innovation or risk (depending on your worldview) by Monroe is how he treats the paramount challenge of quantum computers: error correction. This is basically the idea that for quantum computers to work, some process must be conceived to prevent qubits from decaying at the rate they currently decay at; otherwise crucial calculations get interrupted mid-calculation.
Here, Monroe critiques both the Google approach and responds to criticism from some of his academic colleagues about his approach to error correction: "Google is trying to get to one million qubits that do not work well together."
In contrast, a special encoding process could allow IonQ to create what Monroe called a single sort of "super qubit," which would eliminate 99.9% of native errors. This is the easiest way to get better at quantum computing, as opposed to the quantity-over-quality path Google is pursuing.
But he has to defend himself from others poking holes in his approach as unrealistic, including some of his colleagues at the University of Maryland (all sides still express great respect for each other). Confronted by this criticism, he responded that their path of attack was based on the theory of error correction: "It implies that you will do indefinitely long computations, [but] no one will ever need this high a standard to do business."
"We do not use error correction on our CPU [central processing unit] because silicon is so stable. We call it OK if it fails in one year, since that is more than enough time to be economically worthwhile." Instead of trying to eliminate errors, his strategy is to gradually add more qubits, which achieves slightly more substantial results. His goal is to work around the error-correction problem.
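The classical analogue of that "super qubit" encoding is the repetition code: store one logical bit in several physical copies and decode by majority vote. The Python sketch below illustrates the general encoding idea only (it is not IonQ's scheme): a physical error rate p becomes a logical rate of roughly 3p^2 - 2p^3.

```python
import random

# Classical analogy for encoding one logical bit into several physical
# copies. A 3-copy code decodes wrongly only when 2 or more copies flip:
# 3*p^2*(1-p) + p^3 = 3p^2 - 2p^3, which is below p for small p.
def transmit(bit, p):
    return bit ^ (random.random() < p)  # flip the bit with probability p

def logical_error_rate(p, copies=3, trials=100_000):
    errors = 0
    for _ in range(trials):
        received = [transmit(0, p) for _ in range(copies)]
        if sum(received) > copies // 2:  # majority vote decodes wrongly
            errors += 1
    return errors / trials

p = 0.05
print(f"physical error rate: {p}")
print(f"logical error rate : {logical_error_rate(p):.4f}")  # roughly 0.007
```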
Part of the difference between Monroe and his academic colleagues relates to his having crossed over into a mix of business and academia. Monroe's view on this issue? "Industry and academia do not always see things the same way. Academics are trained to prove everything we do. But if a computer works better to solve a certain problem, we do not need to prove it."
For example, if a quantum computer doubled the value of a financial portfolio compared to a super computers financial recommendations, the client is thrilled even if no one knows how.
He said that when shortcuts solve problems, and certain things cannot be proven but quantum computing finds value, "academics hate it. They are trained to be pessimists. I do believe quantum computers will find narrow applications within five years."
Besides error correction, another question is what the qubits themselves, the basis of different kinds of quantum computers, should be made out of. The technique that many of his competitors are using to make computers out of a particular kind of qubit has the benefit of being not hard to do, inexpensive and representing beautiful physics.
However, he warned: "No one knows where to find it, if it exists. So stay in solid-state physics and build computers out of solid-state systems. Google, Amazon and others are all invested in solid-state computers. But I don't see it happening without fundamental physics breakthroughs. If you want to build and engineer a device, if you want to have a business, you should not be reliant on physics breakthroughs."
Instead of the path of his competitors, Monroe emphasized working with natural quantum atoms, tricking and engineering them to act how he wants, using low pressure instead of low temperatures.
"I work with charged atoms or ions. We levitate them inside a vacuum chamber, which is getting smaller every year. We have a silicon chip. Just electrodes, electric force fields are holding up these atoms. There are no solids and no air in the vacuum chamber, which means the atoms remain extremely well isolated. They are the most perfect atoms we know, so we can scale without worrying about the top of the noise [the threshold where qubits decay]. We can pick qubit levels that do not yet decay."
"Why aren't Google and IBM investing in natural qubits? Because they have a blind spot. They have been first in solid-state physics and engineering for 50 years. If there is a silicon solid-state quantum computer, Intel will make that, but I don't see how it will be scaled," he declared.
MONROE IS far from the full quantum show at Maryland.
Paglione has been a professor at University of Maryland for 13 years and the director of the Maryland Quantum Materials Center for the last five years.
"In 1986, the center was working on high-temperature superconductors," Paglione said, noting that work on quantum computers is a more recent development. The development has not merely altered the focus of the center's research. According to Paglione, it has also helped grow the center from around seven staff members 30 years ago to around 100 staff members when all of the affiliate members, students and administrative staff are taken into account.
Similarly, Dr. Gretchen Campbell, director of the Joint Quantum Institute, told the Magazine that a big part of her institution's role and her personal role has been to first bring together people from atomic physics and condensed-matter physics ("even within physics, we do not always talk to each other"), followed by connecting these experts with computer science experts.
Campbell explained it was crucial to explore the interaction between the quantum realm and quantum algorithms, for which they needed more math and computer science backgrounds, and to continue to move from laboratories to real-world applications, translating the research into technology and interacting more with industry.
She also guided the Magazine, goggles donned, through a lab with a digital micromirror device and laser beams relating to atom clouds and light projectors.
Add in some additional departments at Maryland, as well as a partnership with the National Institute of Standards and Technology (NIST), and the number of staff swells way past 100. What are their many different teams working on? The lab studies and experiments are as varied as the different disciplines, with Paglione talking about possibilities for sensitive magnetic sensors that could be constructed using superconducting quantum interference devices (SQUIDs).
Paglione said magnetometer systems could be used with SQUIDs to sense the magnetic field of samples. These could be used as detectors in water. If they were made sensitive enough, they could sense changes in a magnetic field, such as when a submarine passes by and generates a changed magnetic field.
This has drawn attention from the US Department of Defense.
A multidisciplinary mix of Paglione's team recently captured the most direct evidence to date of a quantum quirk, which permits particles to tunnel through a barrier as if it is not even there. The upshot could be assisting engineers in designing more uniform components to build both future quantum computers and quantum sensors (reported applications could detect not only submarines but aircraft).
Paglione's team, headed by Ichiro Takeuchi, a professor of materials science and engineering at Maryland, successfully carried out a new experiment in which they observed Klein tunneling. In the quantum world, tunneling enables particles, such as electrons, to pass through a barrier even if they lack sufficient energy to actually climb over it. A taller barrier usually makes climbing over harder, and fewer particles are able to cross through. The phenomenon known as Klein tunneling happens when the barrier becomes completely transparent and opens up a portal that particles can traverse regardless of the barrier's height.
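For context, the textbook (non-Klein) tunneling probability through a rectangular barrier falls off exponentially with barrier height and width; this is standard quantum mechanics, not a formula from the article:

```latex
% WKB estimate for tunneling through a barrier of height V > E and width L:
T \approx e^{-2 \kappa L}, \qquad \kappa = \frac{\sqrt{2m\,(V - E)}}{\hbar}
```

Raising the barrier height V or widening L suppresses the transmission T exponentially; Klein tunneling is striking precisely because T approaches 1 regardless of the barrier's height.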
Scientists and engineers from Maryland's Center for Nanophysics and Advanced Materials, the Joint Quantum Institute and the Condensed Matter Theory Center, along with the Department of Materials Science and Engineering and the Department of Physics, succeeded in making the most compelling measurements of the phenomenon to date.
Given that Klein tunneling was initially predicted to occur in the world of high-energy quantum particles moving close to the speed of light, observing the effect was viewed as impossible. That was until scientists revealed that some of the rules governing fast-moving quantum particles can also apply to the comparatively sluggish particles traveling near the surface of some highly unusual materials.
"It was a piece of serendipity that the unusual material and an elemental relative of sorts shared the same crystal structure," said Paglione. "However, the multidisciplinary team we have was one of the keys to this success. Having experts on topological physics, thin-film synthesis, spectroscopy and theoretical understanding really got us to this point."
Bringing this back to quantum computing, the idea is that interactions between superconductors and other materials are central ingredients in some quantum computer architectures and precision-sensing devices. Yet there has always been a problem: the junction, or crossover spot, where they interact differs slightly from device to device. Takeuchi said this meant sinking countless amounts of time and energy into tuning and calibrating to reach the best performance.
Takeuchi said Klein tunneling could eliminate this variability, which has played havoc with device-to-device interactions.
AN ENTIRELY separate quantum application could be physics department chairman Prof. Steve Rolston's work on establishing a quantum communications network. Rolston explained that when a pair of photons are quantum entangled, you can achieve quantum encryption over a communications network by using entangled particles to create secure keys that cannot be hacked. There are varying paths to achieve such a quantum network, and Rolston is skeptical of others in the field who could be seen as cutting corners.
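The key-generation step common to such protocols can be sketched in a few lines. The Python toy below simulates the basis-sifting stage of BB84-style quantum key distribution; entanglement-based schemes such as E91, closer to what Rolston describes, sift measurement bases in the same way. No eavesdropper or error checking is modeled, and all names are illustrative.

```python
import random

# Toy simulation of basis sifting in quantum key distribution. Only rounds
# where sender and receiver happen to choose matching bases contribute key
# bits; an eavesdropper measuring in the wrong basis would introduce
# detectable errors (not modeled in this sketch).
def qkd_sift(rounds=32):
    key = []
    for _ in range(rounds):
        bit = random.randint(0, 1)         # Alice's raw bit
        alice_basis = random.choice("+x")  # her encoding basis
        bob_basis = random.choice("+x")    # Bob's measurement basis
        if alice_basis == bob_basis:       # bases match: the bit is kept
            key.append(bit)
        # mismatched bases yield a random, discarded outcome
    return key

print("sifted key:", "".join(map(str, qkd_sift())))
```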
He also is underwhelmed by Chinas achievements in this area. According to Rolston, no one has figured out how to extend a secure quantum network over any space sizable enough to make the network usable and marketable in practical terms.
Rather, he said existing quantum networks are either limited to very small spaces, or to extend their range they must employ gimmicks that usually impair how secure they are. Because of these limitations, Rolston went as far as to say that his view is that the US National Security Agency views the issue as a distraction.
In terms of export trade barriers or issues with China, he said he opposes controls and believes cooperation in the quantum realm should continue, especially since all of his centers research is made public anyway.
Rolston also lives up to Monroe's framing of the difference between academics and industry-focused people. He said that even Monroe would have to admit that no one is close to the true holy grail of quantum computers (computers with a massive number of qubits) and that the IonQ founder is instead banking on interesting optimization problems being solvable for industry to an extent which will justify the hype.
In contrast, Rolston remained pessimistic that such smaller quantum computers would achieve sufficient superiority at optimization issues in business to justify a rushed prediction that transforming the world is just around the corner.
In Rolston's view, the longer, more patient and steadier path is the one that will eventually reap rewards.
For the moment, we do not know whether Google or IonQ, or those like Monroe or Rolston will eventually be able to declare they were right. We do know that whoever is right and whoever is first will radically change the world as we know it.
IBM partners with the University of Tokyo on quantum computer – Illinoisnewstoday.com
Posted: at 11:49 am
Tokyo: IBM and the University of Tokyo have announced one of Japan's most powerful quantum computers.
According to IBM, IBM Quantum System One is part of the Japan-IBM quantum partnership between the University of Tokyo and IBM, advancing Japan's quest for quantum science, business and education.
IBM Quantum System One is currently in operation for researchers at both Japanese scientific institutions and companies, and access is controlled by the University of Tokyo.
"IBM is committed to growing the global quantum ecosystem and facilitating collaboration between different research communities," said Dr. Dario Gil, director of IBM Research.
According to IBM, quantum computers combine quantum resources with classical processing to provide users with access to reproducible and predictable performance from high-quality qubits and precision control electronics. Users can safely execute algorithms that require iterative quantum circuits in the cloud.
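As a minimal illustration of that quantum-plus-classical workflow, the sketch below uses IBM's open-source Qiskit SDK (the API as of the time of writing) with its bundled local simulator; running on actual IBM Quantum hardware would swap the simulator for a cloud backend, and the two-qubit circuit here is purely a toy.

```python
from qiskit import QuantumCircuit, Aer, execute

# Build a small circuit: superpose qubit 0, entangle qubit 1, measure both.
qc = QuantumCircuit(2, 2)
qc.h(0)
qc.cx(0, 1)
qc.measure([0, 1], [0, 1])

# Classical processing of quantum results: run the circuit, inspect the counts.
backend = Aer.get_backend("qasm_simulator")
counts = execute(qc, backend, shots=1024).result().get_counts()
print(counts)  # roughly half '00' and half '11'
```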
IBM Quantum System One in Japan is IBM's second system built outside the United States. In June, IBM unveiled an IBM Quantum System One in Munich, Germany, managed by the scientific research institute Fraunhofer-Gesellschaft.
IBM's commitment to quantum is aimed at advancing quantum computing and fostering a skilled quantum workforce around the world.
"We are thrilled to see Japan's contributions to research by world-class academics, the private sector, and government agencies," Gil said.
"Together, we can take a big step towards accelerating scientific progress in different areas," Gil said.
Teruo Fujii, president of the University of Tokyo, said: "In the field of rapidly changing quantum technology, it is very important not only to develop the elements and systems of the technology, but also to develop the next generation of human resources in order to achieve a high degree of social implementation."
"Our university has a wide range of research capabilities and has always promoted high-level quantum education from the undergraduate level. Now, with IBM Quantum System One, we will develop and further refine the next generation of quantum-native skill sets."
In 2020, IBM and the University of Tokyo launched the Quantum Innovation Initiative Consortium (QIIC), which aims to strategically accelerate quantum-computing research and development in Japan by bringing together academic talent from universities, research groups and industries nationwide.
Last year, IBM also announced partnerships focused on quantum information science and technology with several organizations: the Cleveland Clinic, the UK's Science and Technology Facilities Council, and the University of Illinois at Urbana-Champaign.
See original here:
IBM partners with the University of Tokyo on quantum computer - Illinoisnewstoday.com
Posted in Quantum Computing
Comments Off on IBM partners with the University of Tokyo on quantum computer – Illinoisnewstoday.com
Why Quantum Resistance Is the Next Blockchain Frontier – Tech Times
Posted: at 11:49 am
As decentralized networks secured by potentially thousands of miners and/or nodes, blockchains are widely considered to be an incredibly secure example of distributed ledger technology.
On the back of this, they also have dozens of potential applications - ranging from decentralized content storage networks to medical records databases and supply chain management. But to this day, they're most commonly thought of as the ideal platform for hosting the financial infrastructure of tomorrow - such as decentralized exchanges and payment settlement networks.
But there's a problem. While the blockchains of today are practically unhackable - due to the type of encryption they use to secure private keys and transactions - this might not be the case for much longer. This is due to the advent of so-called "quantum computers", that is, computers that can leverage the properties of quantum mechanics to solve problems that would be impossible with traditional computers... such as breaking the cryptography that secures current generation blockchains.
Many blockchains of today use at least two types of cryptographic algorithms - asymmetric key algorithms and hash functions.
The first kind, also known as public-key cryptography, is used to produce pairs of private and public keys that are provably cryptographically linked. In Bitcoin, this private key is used to spend UTXOs - thereby transferring value from one person to another. The second kind - the hash function - is used to securely process raw transaction data into a block in a way that is practically irreversible.
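To make the hash-function half of that concrete: Bitcoin applies two rounds of SHA-256 to block headers. A minimal sketch using only Python's standard library (the header bytes here are invented for illustration):

```python
import hashlib

def double_sha256(data: bytes) -> bytes:
    """Two rounds of SHA-256, as Bitcoin uses for block headers."""
    return hashlib.sha256(hashlib.sha256(data).digest()).digest()

header = b"toy header: prev_hash|merkle_root|nonce=42"
print(double_sha256(header).hex())

# Changing even one character produces an unrelated digest (the
# "avalanche effect"), which is what makes recorded blocks
# practically irreversible.
print(double_sha256(header.replace(b"42", b"43")).hex())
```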
As you might imagine, a sufficiently powerful quantum computer capable of breaking either of these security mechanisms could have devastating consequences for susceptible blockchains - it could be used to derive private keys, or even to mine cryptocurrency units much faster than the expected rate (leading to supply inflation).
So, just how far away from this are we? Well, according to recent estimates, a quantum computer with around 4,000 qubits of processing power could be the minimum necessary to break the public key cryptography that secures Bitcoin user funds. A sufficiently flexible quantum computer of that size could, theoretically, take over the funds contained in any Bitcoin p2pk address - that's a total of around 2 million BTC (circa $67 billion at today's rates).
Fortunately, this isn't an immediate concern. As it stands, the world's most powerful quantum computer - the Zuchongzhi quantum computer - currently clocks in at an impressive (albeit insufficient) 66 qubits. However, given the rapid pace of development in the quantum computing sector, some experts predict that Bitcoin's Elliptic Curve Digital Signature Algorithm (ECDSA) could meet its quantum match within a decade.
The algorithm that could potentially be used to break ECDSA has already been developed. If generalized and applied by a powerful enough quantum computer, Peter Shor's polynomial-time quantum algorithm is widely thought to be capable of attacking the Bitcoin blockchain - while similar algorithms could be applied to other forms of traditional encryption.
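The heart of Shor's algorithm is period finding: once the period r of a^x mod N is known, the factors of N usually fall out of a pair of greatest common divisors. The sketch below performs the period-finding step classically (and therefore slowly); the quantum speedup replaces exactly this loop with a quantum Fourier transform. It illustrates the idea, not the quantum algorithm itself.

```python
from math import gcd

def find_period(a: int, n: int) -> int:
    """Brute-force the order r of a modulo n, i.e. a**r % n == 1."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_toy(n: int, a: int = 2) -> tuple:
    r = find_period(a, n)
    if r % 2:
        raise ValueError("odd period; retry with a different base a")
    # Non-trivial factors hide in gcd(a^(r/2) +/- 1, n)
    return gcd(a ** (r // 2) - 1, n), gcd(a ** (r // 2) + 1, n)

print(shor_toy(15))  # (3, 5): 2 has period 4 mod 15, and the gcds give the factors
```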
But this might not be a concern for much longer, thanks to the introduction of what many consider to be the world's first truly quantum-resistant blockchain. The platform, known as QANplatform, is built to resist all known quantum attacks by using lattice cryptography. QAN manages to achieve quantum resistance while simultaneously tackling the energy concerns that come with some other blockchains through its highly efficient consensus mechanism known as Proof-of-Randomness (PoR).
Unlike some other so-called quantum-resistant blockchains, QAN is unusual in that it also supports decentralized applications (DApps) - allowing developers to launch quantum-resistant DApps within minutes using its free developer tools.
Besides platforms like QAN, the development communities behind several popular blockchains are already beginning to consider implementing their own quantum-resistance solutions, such as the recently elaborated commit-delay-reveal scheme - which could be used to transition Bitcoin to a quantum-resistant state. Nonetheless, the future of post-quantum cryptography still remains up in the air, as none of the top ten blockchains by user count have yet committed to a specific quantum-resistant signature scheme.
2021 TECHTIMES.com All rights reserved. Do not reproduce without permission.
See more here:
Why Quantum Resistance Is the Next Blockchain Frontier - Tech Times
Posted in Quantum Computing
Comments Off on Why Quantum Resistance Is the Next Blockchain Frontier – Tech Times
Life, the universe and everything Physics seeks the future – The Economist
Posted: at 11:49 am
Aug 25th 2021
A WISE PROVERB suggests not putting all your eggs in one basket. Over recent decades, however, physicists have failed to follow that wisdom. The 20th century, and indeed the 19th before it, were periods of triumph for them. They transformed understanding of the material universe and thus people's ability to manipulate the world around them. Modernity could not exist without the knowledge won by physicists over those two centuries.
In exchange, the world has given them expensive toys to play with. The most recent of these, the Large Hadron Collider (LHC), which occupies a 27km-circumference tunnel near Geneva and cost $6bn, opened for business in 2008. It quickly found a long-predicted elementary particle, the Higgs boson, that was a hangover from calculations done in the 1960s. It then embarked on its real purpose, to search for a phenomenon called Supersymmetry.
This theory, devised in the 1970s and known as Susy for short, is the all-containing basket into which particle physics's eggs have until recently been placed. Of itself, it would eliminate many arbitrary mathematical assumptions needed for the proper working of what is known as the Standard Model of particle physics. But it is also the vanguard of a deeper hypothesis, string theory, which is intended to synthesise the Standard Model with Einstein's general theory of relativity. Einstein's theory explains gravity. The Standard Model explains the other three fundamental forces (electromagnetism and the weak and strong nuclear forces) and their associated particles. Both describe their particular provinces of reality well. But they do not connect together. String theory would connect them, and thus provide a so-called theory of everything.
String theory proposes that the universe is composed of minuscule objects which vibrate in the manner of the strings of a musical instrument. Like such strings, they have resonant frequencies and harmonics. These various vibrational modes, string theorists contend, correspond to various fundamental particles. Such particles include all of those already observed as part of the Standard Model, the further particles predicted by Susy (which posits that the Standard Model's mathematical fragility will go away if each of that model's particles has a heavier supersymmetric partner particle, or sparticle), and also particles called gravitons, which are needed to tie the force of gravity into any unified theory, but are not predicted by relativity.
But, no Susy, no string theory. And, 13 years after the LHC opened, no sparticles have shown up. Even two as-yet-unexplained results announced earlier this year (one from the LHC and one from a smaller machine) offer no evidence directly supporting Susy. Many physicists thus worry they have been on a wild-goose chase.
They have good reason to be nervous. String theory already comes with a disturbing conceptual price tag: that of adding six (or in one version seven) extra dimensions to the universe, over and above the four familiar ones (three of space and one of time). It also describes about 10^500 possible universes, only one of which matches the universe in which human beings live. Accepting all that is challenging enough. Without Susy, though, string theory goes bananas. The number of dimensions balloons to 26. The theory also loses the ability to describe most of the Standard Model's particles. And it implies the existence of weird stuff such as particles called tachyons that move faster than light and are thus incompatible with the theory of relativity. Without Susy, string theory thus looks pretty much dead as a theory of everything. Which, if true, clears the field for non-string theories of everything.
The names of many of these do, it must be conceded, torture the English language. They include causal dynamical triangulation, asymptotically safe gravity, loop quantum gravity and the amplituhedron formulation of quantum theory. But at the moment the bookies' favourite for unifying relativity and the Standard Model is something called entropic gravity.
Entropy is a measure of a system's disorder. Famously, the second law of thermodynamics asserts that it increases with time (ie, things have a tendency to get messier as they get older). What that has to do with a theory of gravity, let alone of everything, is not, perhaps, immediately obvious. But the link is black holes. These are objects which have such strong gravitational fields that even light cannot escape from them. They are predicted by the mathematics of general relativity. And even though Einstein remained sceptical about their actual existence until the day he died in 1955, subsequent observations have shown that they are indeed real. But they are not black.
In 1974 Stephen Hawking, of Cambridge University, showed that quantum effects at a black hole's boundary allow it to radiate particles, especially photons, which are the particles of electromagnetic radiation, including light. This has peculiar consequences. Photons carry radiant heat, so something which emits them has a temperature. And, from its temperature and mass, it is possible to calculate a black hole's entropy. This matters because, when all these variables are plugged into the first law of thermodynamics, which states that energy can be neither created nor destroyed, only transformed from one form (say, heat) into another (say, mechanical work), what pops out are Einstein's equations of general relativity.
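The numbers behind Hawking's result are easy to check. A back-of-the-envelope Python calculation using the standard textbook formulas for the Hawking temperature and the Bekenstein-Hawking entropy of a one-solar-mass black hole:

```python
from math import pi

hbar = 1.054572e-34  # J s
c    = 2.997925e8    # m/s
G    = 6.674e-11     # m^3 kg^-1 s^-2
k_B  = 1.380649e-23  # J/K
M    = 1.989e30      # kg, one solar mass

# Hawking temperature: T = hbar c^3 / (8 pi G M k_B)
T = hbar * c**3 / (8 * pi * G * M * k_B)

# Bekenstein-Hawking entropy: S = k_B A c^3 / (4 G hbar),
# with horizon area A = 16 pi G^2 M^2 / c^4
A = 16 * pi * G**2 * M**2 / c**4
S = k_B * A * c**3 / (4 * G * hbar)

print(f"T ~ {T:.1e} K")    # ~6e-8 K: far colder than the cosmic microwave background
print(f"S ~ {S:.1e} J/K")  # ~1e54 J/K: a staggering amount of entropy
```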
That relationship was discovered in 2010 by Erik Verlinde of Amsterdam University. It has serious implications. The laws of thermodynamics rely on statistical mechanics. They involve properties (temperature, entropy and so on) which emerge from probabilistic descriptions of the behaviour of the underlying particles involved. These are also the particles described by quantum mechanics, the mathematical theory which underpins the Standard Model. That Einstein's equations can be rewritten thermodynamically implies that space and time are also emergent properties of this deeper microscopic picture. The existing forms of quantum mechanics and relativity thus do indeed both seem derivable in principle from some deeper theory that describes the underlying fabric of the universe.
String theory is not so derivable. Strings are not fundamental enough entities. But entropic gravity claims to describe the very nature of space and time, or, to use Einsteinian terminology, spacetime. It asserts this is woven from filaments of quantum entanglement linking every particle in the cosmos.
The idea of quantum entanglement, another phenomenon pooh-poohed by Einstein that turned out to be true, goes back to 1935. It is that the properties of two or more objects can be correlated (entangled) in a way which means they cannot be described independently. This leads to weird effects. In particular, it means that two entangled particles can appear to influence each other's behaviour instantaneously even when they are far apart. Einstein dubbed this "spooky action at a distance", because it seems to violate the premise of relativity theory that the universe has a speed limit: the speed of light.
As with black holes, Einstein did not live long enough to see himself proved wrong. Experiments have nevertheless shown he was. Entanglement is real, and does not violate relativity because, although the influence of one particle on another can be instantaneous, there is no way to use the effect to pass information faster than light-speed. And, in the past five years, Brian Swingle of Harvard University and Sean Carroll of the California Institute of Technology have begun building models of what Dr Verlinde's ideas might mean in practice, using ideas from quantum information theory. Their approach employs bits of quantum information (so-called qubits) to stand in for the entangled particles. The result is a simple but informative analogue of spacetime.
Qubits, the quantum equivalent of classical bits (the ones and zeros on which regular computing is built), will be familiar to those who follow the field of quantum computing. They are the basis of quantum information theory. Two properties distinguish qubits from the regular sort. First, they can be placed in a state of superposition, representing both a one and a zero at the same time. Second, several qubits can become entangled. Together, these properties let quantum computers accomplish feats such as performing multiple calculations at once, or completing certain classes of calculation in a sensible amount of time, that are difficult or impossible for a regular computer.
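Both properties fit in a few lines of linear algebra. This plain-numpy sketch (illustrative only) builds the canonical entangled two-qubit state by applying a Hadamard gate and then a CNOT to the state |00>:

```python
import numpy as np

zero = np.array([1, 0], dtype=complex)  # the |0> state
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard: creates superposition
I = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],          # entangling two-qubit gate
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

psi = np.kron(zero, zero)   # start in |00>
psi = np.kron(H, I) @ psi   # superpose the first qubit
psi = CNOT @ psi            # entangle the second qubit with the first

print(np.round(psi, 3))  # [0.707 0 0 0.707]: the Bell state (|00> + |11>)/sqrt(2)
```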
And because of their entanglement qubits can also, according to Dr Swingle and Dr Carroll, be used as stand-ins for how reality works. More closely entangled qubits represent particles at points in spacetime that are closer together. So far, quantum computers being a work in progress, this modelling can be done only with mathematical representations of qubits. These do, though, seem to obey the equations of general relativity. That supports entropic gravity's claims.
All of this modelling puts entropic gravity in pole position to replace strings as the long-sought theory of everything. But the idea that spacetime is an emergent property of the universe rather than being fundamental to it has a disturbing consequence. It blurs the nature of causality.
In the picture built by entropic gravity, spacetime is a superposition of multiple states. It is this which muddies causality. The branch of maths that best describes spacetime is a form of geometry that has four axes at right angles to each other instead of the more familiar three. The fourth represents time, so, like the position of objects, the order of events in spacetime is determined geometrically. If different geometric arrangements are superposed, as entropic gravity requires, it can therefore sometimes happen that the statements A causes B and B causes A are both true.
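As a classical warm-up for that claim, ordinary special relativity already makes the time order of spacelike-separated events frame-dependent, as the short sketch below shows (units chosen so that c = 1). The superposition of geometries that entropic gravity requires pushes this ambiguity further, into genuinely indefinite causal order.

```python
from math import sqrt

def boost(t, x, v):
    """Lorentz-boost the event (t, x) into a frame moving at speed v (c = 1)."""
    g = 1.0 / sqrt(1.0 - v * v)
    return g * (t - v * x), g * (x - v * t)

# Two spacelike-separated events: B happens after A in this frame,
# but they are too far apart for light to connect them.
event_a, event_b = (0.0, 0.0), (1.0, 5.0)

for v in (0.0, 0.5, 0.9):
    t_a, _ = boost(*event_a, v)
    t_b, _ = boost(*event_b, v)
    print(f"v={v}: A first" if t_a < t_b else f"v={v}: B first")
# At v=0, A comes first; in faster-moving frames the order reverses.
```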
This is not mere speculation. In 2016 Giulia Rubino of the University of Bristol, in England, constructed an experiment involving polarised photons and prisms which achieved exactly that. This spells trouble for those who have old-fashioned notions about causality's nature.
However, Lucien Hardy of the Perimeter Institute, in Canada, has discovered a way to reformulate the laws of quantum mechanics to get around this. In his view, causality as commonly perceived is like data compression in computing: it is a concept that gives you more bang for your buck. With a little bit of information about the present, causality can infer a lot about the future, compressing the amount of information needed to capture the details of a physical system in time.
But causality, Dr Hardy thinks, may not be the only way to describe such correlations. Instead, he has invented a general method for building descriptions of the patterns in correlations from scratch. This method, which he calls the causaloid framework, tends to reproduce causality but it does not assume it, and he has used it to reformulate both quantum theory (in 2005) and general relativity (in 2016). Causaloid maths is not a theory of everything. But there is a good chance that if and when such a theory is found, causaloid principles will be needed to describe it, just as general relativity needed a geometry of four dimensions to describe spacetime.
Entropic gravity has, then, a lot of heavy-duty conceptual work to back it up. But it is not the only candidate to replace string theory. Others jostling for attention include an old competitor called loop quantum gravity, originally proposed in 1994 by Carlo Rovelli, then at the University of Pittsburgh, and Lee Smolin, of the Perimeter Institute. This, and causal dynamical triangulation, a more recent but similar idea, suggest that spacetime is not the smooth fabric asserted by general relativity but, rather, has a structure: either elementary loops or triangles, according to which of the two theories you support.
A third option, asymptotically safe gravity, goes back still further, to 1976. It was suggested by Steven Weinberg, one of the Standard Model's chief architects. A natural way to develop a theory of quantum gravity is to add gravitons to the model. Unfortunately, this approach got nowhere, because when the interactions of these putative particles were calculated at higher energies, the maths seemed to become nonsensical. However, Weinberg, who died in July, argued that this apparent breakdown would go away (in maths speak, the calculations would be asymptotically safe) if sufficiently powerful machines were used to do the calculating. And, with the recent advent of supercomputers of such power, it looks, from early results, as if he might have been right.
One of the most intriguing competitors of entropic gravity, though, is the amplituhedron formulation of quantum theory. This was introduced in 2013 by Nima Arkani-Hamed of the Institute for Advanced Study in Princeton and Jaroslav Trnka of the University of California, Davis. They have found a class of geometric structures dubbed amplituhedrons, each of which encodes the details of a possible quantum interaction. These, in turn, are facets of a master amplituhedron that encodes every possible type of physical process. It is thus possible to reformulate all of quantum theory in terms of the amplituhedron.
Most attempts at a theory of everything try to fit gravity, which Einstein describes geometrically, into quantum theory, which does not rely on geometry in this way. The amplituhedron approach does the opposite, by suggesting that quantum theory is actually deeply geometric after all. Better yet, the amplituhedron is not founded on notions of spacetime, or even statistical mechanics. Instead, these ideas emerge naturally from it. So, while the amplituhedron approach does not as yet offer a full theory of quantum gravity, it has opened up an intriguing path that may lead to one.
That space, time and even causality are emergent rather than fundamental properties of the cosmos is a radical idea. But this is the point. General relativity and quantum mechanics, the physics revolutions of the 20th century, were viewed as profound precisely because they overthrew common sense. To accept relativity meant abandoning a universal notion of time and space. To take quantum mechanics seriously meant getting comfortable with ideas like entanglement and superposition. Embracing entropic gravity or its alternatives will require similar feats of the imagination.
No theory, though, is worth a damn without data. That, after all, is the problem with Supersymmetry. Work like Dr Rubino's points the way. But something out of a particle-physics laboratory would also be welcome. And, though their meaning is obscure, the past few months have indeed seen two experimentally induced cracks in the Standard Model.
On March 23rd a team from CERN, the organisation that runs the LHC, reported an unexpected difference in behaviour between electrons and their heavier cousins, muons. These particles differ from one another in no known properties but their masses, so the Standard Model predicts that when other particles decay into them, the two should each be produced in equal numbers. But this appears not to be true. Interim results from the LHC suggest that a type of particle called a B-meson is more likely to decay into an electron than a muon. That suggests an as-yet-undescribed fundamental force is missing from the Standard Model. Then, on April 7th, Fermilab, America's biggest particle-physics facility, announced the interim results of its own muon experiment, Muon g-2.
In the quantum world, there is no such thing as a perfect vacuum. Instead, a froth of particles constantly pops in and out of existence everywhere in spacetime. These are virtual rather than real particles; that is, they are transient fluctuations which emerge straight out of quantum uncertainty. But, although they are short-lived, during the brief periods of their existence they still have time to interact with more permanent sorts of matter. They are, for example, the source of the black-hole radiation predicted by Hawking.
The strengths of their interactions with types of matter more conventional than black holes are predicted by the Standard Model, and to test these predictions, Muon g-2 shoots muons in circles around a powerful superconducting magnetic-storage ring. The quantum froth changes the way the muons wobble, which detectors can pick up with incredible precision. The Muon g-2 experiment suggests that the interactions causing these wobbles are slightly stronger than the Standard Model predicts. If confirmed, this would mean the model is missing one or more elementary particles.
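The headline significance figure follows directly from the publicly reported April 2021 numbers. A quick check, with values in units of 1e-11 and the uncertainties combined in quadrature (the figures below are the announced experimental average and the Standard Model prediction):

```python
from math import hypot

a_exp, sig_exp = 116_592_061, 41  # combined Fermilab + Brookhaven measurement
a_sm,  sig_sm  = 116_591_810, 43  # Standard Model Theory Initiative prediction

diff = a_exp - a_sm             # 251e-11
sigma = hypot(sig_exp, sig_sm)  # combined uncertainty, ~59e-11
print(f"tension: {diff / sigma:.1f} standard deviations")  # ~4.2
```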
There is a slim chance that these are the absent sparticles. If so, it is the supporters of supersymmetry who will have the last laugh. But nothing points in this direction and, having failed thus far to stand their ideas up, they are keeping sensibly quiet.
Whatever the causes of these two results, they do show that there is something out there which established explanations cannot account for. Similarly unexplained anomalies were starting points for both quantum theory and relativity. It looks possible, therefore, that what has seemed one of physicss darkest periods is about to brighten into a new morning.
This article appeared in the Science & technology section of the print edition under the headline "Bye, bye, little Susy"
See the rest here:
Life, the universe and everything Physics seeks the future - The Economist
Posted in Quantum Computing
Comments Off on Life, the universe and everything Physics seeks the future – The Economist