
Category Archives: Quantum Computing

Atos Talks European HPC Openness and a Hybrid Future of AI, Machine Learning and Quantum Supercomputing – insideHPC

Posted: May 15, 2022 at 9:58 pm

This exclusive Q&A interview was conducted by Nages Sieslack of the ISC 2022 conference organization, with Eric Eppe, head of portfolio and solutions, HPC & Quantum at France-based HPC systems vendor Atos.

Nages Sieslack: How are the needs of the European HPC market changing with regard to traditional supercomputing and things like deep learning/AI and data-centric computing?

Eric Eppe: Supercomputers are the soft power of all nations. They are essential for numerical simulations, accelerating technological, industrial, and scientific innovation. We see a transition from traditional compute-centric simulation toward data-centric computing, resulting in a more heterogeneous workload. We believe the future of HPC is hybrid. This means combining traditional simulation workflows using CPUs and GPUs (even TPUs, FPGAs, IPUs, why not QPUs) with advanced techniques that accelerate parts of these workflows through machine learning, artificial intelligence (AI), or even quantum computing (QC). The virtue of deep learning/AI is not limited to the GPU accelerator on the hardware side; it also serves as the foundation of smart software for HPC cluster management and workload optimization. In this regard, deep learning/AI-empowered software optimizes workloads while increasing the system's global efficiency.
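The workflow acceleration Eppe describes, using a cheap learned model to stand in for part of an expensive simulation, can be sketched in a few lines. This is an illustrative toy only, not Atos code: `expensive_simulation` is a made-up stand-in for a costly physics solve, and plain linear interpolation over a handful of solver calls stands in for a trained ML surrogate.

```python
import bisect
import math

def expensive_simulation(x):
    # Stand-in for a costly physics solve (hypothetical toy function).
    return math.sin(x) + 0.1 * x * x

# "Train": a few expensive samples across the parameter range.
xs = [i * 0.5 for i in range(9)]   # 0.0, 0.5, ..., 4.0
ys = [expensive_simulation(x) for x in xs]

def surrogate(x):
    # Cheap piecewise-linear model standing in for an ML surrogate.
    i = min(max(bisect.bisect_right(xs, x) - 1, 0), len(xs) - 2)
    t = (x - xs[i]) / (xs[i + 1] - xs[i])
    return ys[i] + t * (ys[i + 1] - ys[i])

# Screen many candidate parameters with the cheap model; only the most
# promising one goes back to the expensive solver.
candidates = [i * 0.01 for i in range(401)]      # 0.00 .. 4.00
best = min(candidates, key=surrogate)
refined = expensive_simulation(best)             # 1 expensive call instead of 401
```

The design point is the one Eppe makes: the surrogate does not replace the solver, it decides where the solver's expensive cycles are spent.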

Sieslack: How are those changes affecting the types of systems and services you are offering?

Eppe: Atos leads the hybrid computing trend with its existing HPC portfolio and the newly revealed BullSequana XH3000, its next-generation hybrid computing platform and the foundation for simulation at any scale, up to exascale. It has unparalleled flexibility, industry-leading density, and embedded security. For Atos, exascale doesn't mean exaflopic performance only. We believe that increasing global system and application efficiency is the only way to decrease system cost and stay within a reasonable power envelope at that scale. Thus, we have incorporated ML/AI mechanisms into our HPC software suites to optimize simulations and keep energy consumption under control for unprecedented efficiency. We have also seen the need for high-performance AI simulations and launched the ThinkAI solution last year. With ThinkAI, we eliminate the roadblocks in designing, developing, and installing high-performance AI systems, putting AI simulation at the fingertips of all businesses and academics. Furthermore, we leverage our HPC-as-a-Service portfolio, enabling any customer to run their simulations anywhere they want.

Sieslack: Geographically, where do you see your biggest opportunities for growth in the HPC market, both within Europe and globally?

Eppe: Compared with China, the U.S., and Japan, which are relatively closed HPC economies (they build their own HPC systems for their own use), Europe is the most dynamic and open HPC market. Europe has invested significantly in the EuroHPC JU, and Atos powers five of the seven EuroHPC centers. Europe continues to invest in supercomputing, including HPC and quantum computing, e.g. the upcoming exascale tenders, as an extension of the EuroHPC JU. Atos designs, develops, and builds its HPC systems in Angers, France, and is number one in HPC in Europe. We also have our HPC, AI, and QC R&D centers in France. We actively participate in European initiatives, developing the European microprocessor with EPI and contributing to the GAIA-X initiative. Atos is the undisputed leader in the European HPC market, instrumental to its technological and economic sovereignty.

Sieslack: What trends are you seeing for your HPC on-demand service via your Nimbix cloud offering with regards to use cases and the types of customers?


Eppe: As industry analysts have predicted, cloud computing will continue to grow at double-digit rates through 2025.* Our on-demand service through Nimbix is seeing this growth, with customers across the globe consuming compute power in record numbers. We have seen on-demand usage increase specifically within automotive manufacturing, life sciences, and academic research organizations. We are pleased to offer these industries the most comprehensive hybrid HPC cloud portfolio and are excited to be advancing this space with new offerings and technology. In fact, in the second half of the year, we will deploy our first public cloud offering, in partnership with a top hyperscaler, to provide genomic analytics of sequencing data from specialized cluster resources delivered by Atos Nimbix.

Sieslack: What is Atos doing now on the quantum computing front? Which companies and partners are currently using your Quantum Learning Machine simulator?

Eppe: Quantum computing will reinvent how we simulate, co-existing with HPC. In December 2021, Atos confirmed its role as a global leader in quantum hybridization technologies at its 8th Quantum Advisory Board. At Atos, we work mainly on five strategic directions to accelerate quantum computing:

On top of these five strategic paths, we have launched Q-score, a universal metric to benchmark quantum application performance. Together with clients worldwide, such as Argonne Labs, BMW, CESGA, SENAI CIMATEC, and Total, we are accelerating the arrival of the quantum era.
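For readers unfamiliar with Q-score, the idea is to measure how large a combinatorial optimization (MaxCut) instance a machine can solve meaningfully better than chance. The sketch below is a simplified illustration of that idea, not Atos's exact formula or procedure: a greedy classical heuristic stands in for the solver under test, and a uniformly random partition, which cuts half the edges on average, provides the baseline.

```python
import random

def random_graph(n, p=0.5, seed=7):
    # Random instance in the spirit of such benchmarks (assumed parameters).
    rng = random.Random(seed)
    return [(i, j) for i in range(n) for j in range(i + 1, n) if rng.random() < p]

def cut_value(edges, side):
    # Count edges whose endpoints land in different partitions.
    return sum(1 for i, j in edges if side[i] != side[j])

def greedy_solver(n, edges):
    # Stand-in for the solver under test: each vertex joins whichever side
    # cuts more of its already-placed neighbours (guarantees >= |E|/2 cut).
    side = {}
    for v in range(n):
        s0 = sum(1 for i, j in edges
                 if (i == v and side.get(j) == 0) or (j == v and side.get(i) == 0))
        s1 = sum(1 for i, j in edges
                 if (i == v and side.get(j) == 1) or (j == v and side.get(i) == 1))
        side[v] = 1 if s0 >= s1 else 0
    return side

n = 12
edges = random_graph(n)
score = cut_value(edges, greedy_solver(n, edges))
baseline = len(edges) / 2      # expected cut of a uniformly random split
beta = (score - baseline) / baseline
# The benchmark question: up to what instance size n does the solver under
# test keep beating the random baseline by a meaningful margin?
```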

*Source: Intersect360 Research forecasts cloud computing will continue to grow at double-digit rates through 2025.


Posted in Quantum Computing | Comments Off on Atos Talks European HPC Openness and a Hybrid Future of AI, Machine Learning and Quantum Supercomputing – insideHPC

Colocation consolidation: Analysts look at what’s driving the feeding frenzy – The Register

Posted: at 9:58 pm

Analysis Colocation facilities aren't just a place to drop a couple of servers anymore. Many are quickly becoming full-fledged infrastructure-as-a-service providers as they embrace new consumption-based models and place a stronger emphasis on networking and edge connectivity.

But supporting the growing menagerie of value-added services takes a substantial footprint and an even larger customer base, a dynamic that's driven a wave of consolidation throughout the industry, analysts from Forrester Research and Gartner told The Register.

"You can only provide those value-added services if you're big enough," Forrester research director Glenn O'Donnell said.

The past few months have seen this trend play out en masse, the latest example being private equity firm DigitalBridge Investment Management's takeover of datacenter provider Switch Inc in a deal valued at $11 billion.

Switch operates datacenters specializing in high-performance infrastructure, and completed its fifth Prime datacenter campus, in Texas, last year. The DigitalBridge deal is only the latest colo acquisition in recent memory.

"There have been a pile of smaller colocation providers that have been coming together, either being acquired by the big boys, or they've been merging," O'Donnell said.

There's been a flurry of colocation mergers and acquisitions over the past few months. Here's just a sampling: NorthC acquired Netrics, LightEdge bought NFinit, EdgeConnex made off with GTN, Unitas Global snapped up INAP, VPLS nabbed a Carrier-1 datacenter in Texas, and Digital 9 absorbed Finnish colo Ficolo and Volta's London datacenters.

So what's driving this ramp in M&A activity? You might think it's the cloud, and while there's certainly some truth to that, O'Donnell says it's not the full story.

"I always like to remind people that just because cloud is so big and growing does not mean the datacenter is dead," he said, adding that to some extent cloud has actually driven people to colos more than it has hurt them.

"I won't give cloud all of the credit, but cloud certainly proved that this is a viable way of doing things," O'Donnell added.

What the cloud has managed to do is force colocation providers to innovate around new consumption models and platform services, while simultaneously expanding their reach closer to the edge.

The major cloud providers operate a relatively small number of extremely large datacenters located in key metros around the world. By contrast, colocation providers like Equinix and Digital Realty operate hundreds of datacenters around the globe.

This reach is not only one of the big attractions of colocation providers, Gartner analyst Matthew Brisse said, but it also turns out to be one of the biggest drivers of M&A activity.

"Size matters in this business because customers, especially multinational customers, want datacenters in a lot of different places," O'Donnell said.

According to Brisse, when enterprises start looking into colocation facilities, their main concern is getting workloads spun up in the right place. "The main reason that people go to colos is location, location, location," he said.

And this demand has only accelerated as colocation providers look to offer services closer to the edge.

"We see the colocation providers starting to build out their edge offering as opposed to a simple hoteling experience for your infrastructure," Brisse said.

These aren't necessarily large datacenter facilities in the traditional sense, either, he explained. These can be as small as a half-sized shipping container positioned at the base of a cell tower.

Smaller regional colocation providers also serve an important role because they tend to build in places the larger players overlook, Brisse explained.

"A lot of companies don't have the luxury of sitting right next to an Equinix facility," he said. "There's lots of opportunity out there for the colocation market in totality."

And as colocation providers inch closer to the edge, Brisse argues networking and automation are only becoming more important.

One of the most potent value adds offered by major colocation providers today is networking.

"As you look at the colocation services, the networking services have become a pretty big deal to differentiate them from just being a simple chunk of real estate to plop your servers," O'Donnell said.

And here again the larger players have the advantage. "Networking connectivity requires a big provider with lots of locations connected by their own fiber," he added.

These backbone networks allow workloads running in a datacenter on one side of the country to communicate with another without ever going out over the open internet.

But it's not just networking between colocation datacenters that's important. Many of these colocation facilities are located directly adjacent to the major cloud and software-as-a-service providers.

"So AWS, for example, or Microsoft Azure might be in the same building as you and connecting to it is just a matter of connecting to a different cage in that same building," O'Donnell said. "Smaller players can't do that, but the bigger guys can."

However, as customers increasingly turn to colocation providers for edge compute and networking, complexity rears its ugly head, Brisse argues.

In the future, "we're going to have lots of datacenters everywhere; we're going to have lots of data distributed in the right location; we're going to have edge facilities everywhere bringing data close to the edge," he said. "It is not going to be possible for humans to monitor all of that activity."

So, in addition to growing their footprint and network services, Brisse believes colos will also need to invest in AI operations capabilities to manage this complexity.

Both Brisse and O'Donnell expect the colocation market to continue to consolidate as macroeconomic forces put pressure on smaller players.

"If the economic troubles we're seeing are persistent, I think we will see an acceleration of this kind of [M&A] activity," O'Donnell said.

It's important to remember that while colos may look like tech companies on the inside, on the books, they're really real estate investment trusts, he said, adding that in the current economic environment, colos are a comparatively safe bet in an otherwise dismal commercial real estate market.

"Colo is a hot market and getting hotter," O'Donnell said.


Will Bitcoin be killed by quantum computing? – Investment Monitor

Posted: May 3, 2022 at 10:30 pm

Quantum computers will eventually break much of today's encryption, including the signing algorithm of Bitcoin and other cryptocurrencies. Approximately one-quarter of the Bitcoin ($168bn) in circulation in 2022 is vulnerable to quantum attack, according to a study by Deloitte.

Cybersecurity specialist Itan Barmes led the vulnerability study of the Bitcoin blockchain. He found that the exposure of the blockchain to a large enough quantum computer presents a systemic risk. "If [4 million] coins are eventually stolen in this way, then trust in the system will be lost and the value of Bitcoin will probably go to zero," he says.

Today's cryptocurrency market is valued at approximately $3trn, and Bitcoin reached an all-time high of more than $65,000 per coin in 2021, making crypto the best-performing asset class of the past ten years, according to Gemini's Global State of Crypto report for 2022. However, Bitcoin's bumpy journey into mainstream investor portfolios coincides with major advances in quantum computing.

Most encryption relies on the relationship between public and private keys, known as asymmetric cryptography. Quantum-vulnerable Bitcoins include those created before 2010, when public keys had not yet been hashed into a different and safer format. Also at risk are Bitcoin addresses that have already been used once and have therefore become visible on the blockchain. There are four million Bitcoin addresses that could in theory be hacked by a quantum computer large enough to derive the corresponding private keys, unlocking the value and transferring it to another address. This is known as a storage attack.
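The mechanics of why hashed addresses resist a storage attack until first use can be shown with a toy sketch. Real Bitcoin uses ECDSA key pairs and SHA-256 followed by RIPEMD-160; in this illustration, random bytes and plain SHA-256 stand in for both.

```python
import hashlib
import secrets

# Stand-ins: a random "private key" and a hash-derived "public key"
# (real Bitcoin derives the public key via elliptic-curve multiplication).
private_key = secrets.token_bytes(32)
public_key = hashlib.sha256(b"pub:" + private_key).digest()

# A hashed address commits only to a hash of the public key.
address = hashlib.sha256(public_key).hexdigest()

# Before the first spend, the chain shows only `address`: there is no public
# key material for Shor's algorithm to attack. Spending reveals `public_key`
# in the transaction, and from then on the address is quantum-vulnerable.
revealed_at_spend = public_key.hex()
```

The one-way hash is what separates the two populations in the Deloitte study: unhashed or already-spent-from addresses expose their keys, fresh hashed addresses do not.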

The second kind of attack, a transit attack, targets Bitcoin transactions while they are in transit. In contrast to storage attacks, where only a subset of addresses is vulnerable, all transactions are vulnerable to transit attacks.

In January 2022, a team at Sussex University spin-out company Universal Quantum published research on transit attacks, calculating that it would require a quantum computer with 1.9 billion qubits to break Bitcoin's encryption within the required ten-minute window (the time taken for a Bitcoin block to be mined). Even with 317 million qubits it would take an hour, and with 13 million qubits a day. For context, IBM's superconducting quantum computer currently has a 127-qubit processor.
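The three Universal Quantum figures are mutually consistent under a simple inverse scaling between qubits and attack window, a quick check anyone can reproduce. The reading of this as a fixed "qubit-minutes" budget is this article's simplification, not the paper's own wording:

```python
# Attack window (minutes) -> qubits required, as quoted above.
minutes_to_qubits = {10: 1.9e9, 60: 317e6, 24 * 60: 13e6}

# If qubits scale inversely with time, window * qubits should be constant.
budgets = [t * q for t, q in minutes_to_qubits.items()]
spread = max(budgets) / min(budgets)   # close to 1.0 means consistent scaling
```

All three quotes imply roughly the same budget of about 1.9e10 qubit-minutes, i.e. halving the window doubles the hardware needed.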

Cybersecurity is top of mind within the quantum community, but many industry insiders, including Barmes, believe there is not enough communication between the quantum computing community and the Bitcoin community to ensure future cybersecurity on the Bitcoin blockchain. "There are a lot of statements made by either community which indicate a lack of understanding of the other side," he says.

Barmes believes that as long as cryptocurrencies migrate to post-quantum cryptography in time, everything should be fine. "It is not too late to migrate, but such a migration takes time, so waiting until the last moment might turn out to be too late," he says. "The exact moment when it becomes too late is, of course, unknown."

The blockchain presents a unique challenge for quantum-safe cryptography because of its decentralised nature and the governance complications this poses. "Achieving this consensus is extremely difficult, so the governance issues are possibly equal to the complexities of the technical problems; agreement takes much more time than people think," says Barmes. While not enough is being done on technical solutions, too little attention is also given to governance issues, he adds.

Barmes advocates awareness of the issues as the first stage in addressing the problem. Then, "very technical people need to come up with published and demonstrable solutions, not just speculation," he adds.

For investors without a technical background, quantum security is a difficult topic to evaluate. "Cryptocurrency projects should be more transparent about their plans to mitigate quantum risk," says Barmes. "That will give investors the information they need in order to make decisions." The hope is that this transparency could encourage a more robust mitigation strategy.

While more mainstream investors may not be aware of the potential security issues that quantum computing advances pose for Bitcoin, Miko Matsumura, general partner at San Francisco-based Cryptos Capital, says most knowledgeable investors have priced in the risk of quantum cybersecurity breaches. He is not concerned about quantum computing risk because attackers have two ways to breach Bitcoin, neither of which presents a catastrophe for the blockchain.

"You could attack Bitcoin's signing mechanism, which would create havoc during an attack, but the attack would be very visible," adds Matsumura. "If such attacks were to take place, Satoshi [Bitcoin's architect] had a plan, which was simply to hard fork Bitcoin (a complete protocol change leading to divergence from the original) and replace the signing mechanism."

On the point of consensus, Matsumura is much more buoyant than Barmes. "Satoshi already wrote about what to do in case the signing algorithm was penetrated, so it is likely that the community would just agree to do what Satoshi proposed," he says.

On this more positive note, Duncan Jones, head of cybersecurity at Cambridge Quantum, says the conversation about risk needs to focus more on how quantum technologies can enhance digital asset security. "The focus is often on the threat from quantum computers, and yet blockchains face complex and sophisticated threats every day," he says. "We can strengthen blockchains against some of these risks if we integrate quantum technology into the core of these systems."

This is a view reiterated by Charles Hayter, CEO and co-founder of CryptoCompare, who believes quantum computing cyber risk is not on the radar of the cryptocurrency investment community. "The optimistic view is that quantum-safe cryptocurrency will solve the problems that arise, and that is the reason the community is not worried," he says. "It is considered by many in the industry as like having to replace the engine on your car: there is a solution."

"Cryptography has always been a race against hackers, and there have always been solutions along the way," says Hayter. As for quantum cybersecurity mitigation strategies on cryptocurrency exchanges, he believes it is far too early for quantum computing to be an issue.

Transitioning to post-quantum algorithms and conversations between the Bitcoin community and the quantum computing community will be key to mitigating the cybersecurity risk to cryptocurrency investment. As always, timelines around quantum computing appear to be vague, but nevertheless the time has come for Bitcoin investors to take note.


IONQ Stock Falls 10% Following Scorpion Capital Short Report. 11 Things to Know. – InvestorPlace

Posted: at 10:30 pm

IonQ (NYSE:IONQ) stock is down about 7% today after Scorpion Capital released a 183-page short report on the company. IonQ operates as a quantum-computing company, claiming to have the world's most powerful quantum computer. Last year, it became publicly traded through a special purpose acquisition company (SPAC) transaction.


However, based on interviews with industry experts and former employees, Scorpion believes the company's claims of a 32-qubit machine are a "brazen hoax." The firm points out that past employees have stated the technology "doesn't exist" and that it was "totally made up."

Scorpion even goes as far as comparing IonQ's quantum computer to Nikola's (NASDAQ:NKLA) rolling-truck catastrophe. The firm characterizes the quantum computer as IonQ's claim to fame as well as the basis of its SPAC.

Shares of IONQ stock are down over 50% year-to-date (YTD). So, with that in mind, let's get into the details of the short report.

On the date of publication, Eddie Pan did not hold (either directly or indirectly) any positions in the securities mentioned in this article. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.



Quantum Alliance Initiative urges US government to collaborate on universal quantum computer R&D – TelecomTV

Posted: at 10:30 pm

Currently, the US government is already planning for, and devoting resources to, systems that will enable and protect encrypted data, whether strategic, tactical or commercial, against decryption by quantum computers belonging to enemy states, even though such devices do not yet exist (allegedly). The National Institute of Standards and Technology (NIST), part of the US Commerce Department, is one body already working on the problem of devising methodologies, technologies and standards that will guarantee encoded messages cannot be decoded by even the most powerful quantum computers.

In parallel with the development of quantum-resistant encryption, research is also underway into having quantum systems developed and in place now to decrypt intercepted and stored messages and data obtained from enemy states as soon as quantum computers become a reality. To that end, scientists at NIST have announced that a compendium of encryption algorithms believed able to withstand decoding by quantum computers is to be released to interested parties, organisations, teams and individual scientists within the next few weeks.

They will be tested to destruction, amended and re-iterated, and estimates suggest it could take two years before final, fully tested versions of the algorithms are released, and five to ten years before they are put to use in a quantum computing environment. The worry, of course, is that just because encrypted data cannot be decrypted on a US quantum computer, it doesn't necessarily mean the same will be true when that data is run through a quantum computer belonging to an enemy state.

Meanwhile, politicians are urging the legislature to act now to ensure that government encryption systems and methodologies are brought up to state-of-the-art levels of sophistication as soon as possible. Thus, a new bipartisan bill sponsored by both Democrats and Republicans will require the government to adopt quantum-resistant encryption as soon as the necessary standards are available.

However, the US government relies heavily on easily available, and relatively cheap, commercial software for many of its systems, and even if a root-and-branch change to extremely expensive bespoke systems is mandated, it will take many years to accomplish. Government agencies would still be beholden to the private computing, IT and telecoms industries to help them make what would be a very difficult transition. That's why another plank in the US government's quantum computing strategy will be to work very closely with allies, including Australia, Canada and the UK, which are themselves already well along the way to delivering a quantum computer.

As the QAI prospectus states, a quantum computer can indeed pose a threat to national security as it exists right now, but quantum cybersecurity can provide a solution. That's because it will usher in an era of a nearly unhackable cyberspace "through a layered approach of implementing quantum random numbers, quantum-resistant algorithms, and quantum communication networks. Through a concerted effort to develop and implement quantum cybersecurity solutions, we can secure today's most sensitive data from both current hackers and future quantum-enabled hackers, as well as protect vital infrastructure from the same threats."


Global Quantum Computing in Health Care Market 2022 Trending Technologies, Developments, Key Players and Forecast to 2028 Queen Anne and Mangolia…

Posted: at 10:30 pm

The study report Global Quantum Computing in Health Care Market from 2022 to 2028 by MarketandResearch.biz is a blend of insights, solutions, concrete proposals, and cutting-edge technology that maps a functional market landscape. Based on professional and extensive research, the global Quantum Computing in Health Care market predictions cover 2022 through 2028.

The aim of the report is to provide readers with information that helps them form a better view of the regional activity of the Quantum Computing in Health Care market, by measuring the threat of replacement merchandise, competition intensity, market appeal, supplier relationship strength, and the market's benefits, deficiencies, dangers, and potentially risky opportunities.

The essential viewpoints in the Quantum Computing in Health Care sector are assessed, and the variables that will power the industry's growth are identified. The paper looks at past growth trends, current growth factors, and future strategic value. SWOT analysis and other methods are used to analyze this data to provide an accurate opinion on the industry's status, which is used to facilitate an optimal growth plan for any competing product or to anticipate the potential shape and advancement of the Quantum Computing in Health Care industry.

DOWNLOAD FREE SAMPLE REPORT: https://www.marketandresearch.biz/sample-request/191906

The report examines the following companies:

Market segmentation by type:

Market segmentation by application:

Regions & countries in the global Quantum Computing in Health Care market report:

ACCESS FULL REPORT: https://www.marketandresearch.biz/report/191906/global-quantum-computing-in-health-care-market-growth-status-and-outlook-2021-2026

In addition to clear and precise information and income projections, the report includes essential sales statistics and information about market sellers and dealers. The study also includes a summary of terminal industries and their buying consumers. Qualitative research methods such as Porter's five forces analysis, SWOT analysis, and PESTEL analysis are included in the paper. A comprehensive assessment of the global Quantum Computing in Health Care market and a novel research approach were used. Standard reference interviews were conducted with product portfolio managers, senior executives, vice presidents, and CEOs who contributed significantly to the report.

Customization of the Report:

This report can be customized to meet the client's requirements. Please connect with our sales team (sales@marketandresearch.biz), who will ensure that you get a report that suits your needs. You can also get in touch with our executives on 1-201-465-4211 to share your research requirements.

Contact Us
Mark Stone
Head of Business Development
Phone: 1-201-465-4211
Email: sales@marketandresearch.biz


The Ecosystem: Finland punches above its weight in quantum – Science Business

Posted: at 10:30 pm

Finns joke that their advantage in quantum computing is that the cold you need to run the processors comes for free. But make no mistake, the quantum ecosystem in Finland is heating up.

Helmi, a five-qubit computer inaugurated last November in Espoo, will this month connect to the LUMI supercomputer in Kajaani, making blended computing projects possible. And in April, the country inked a cooperation statement with the US for quantum information science and technology, the first such agreement with a country in mainland Europe.

"That statement gives us credibility that we are a strong partner to work with," says Himadri Majumdar, who leads the quantum programme at state-owned research centre VTT. "While we've had academic collaborations with the US for a long time, this opens up commercial opportunities for Finnish and US companies to collaborate and find solutions that are useful for both sides."

One concrete effect is that Finland has been endorsed for cooperation with the Quantum Economic Development Consortium (QED-C), a body dedicated to the growth of the US quantum industry. "The QED-C is only open to a few European member states and, thanks to the statement, Finland has been selected to be one of them," says Jan Goetz, chief executive and co-founder of quantum computer start-up IQM. Other benefits are expected to follow, with public funding for collaboration high on the wish list.

Finland's pitch in quantum is that it has a complete ecosystem. "We have all the components in place, in a concentrated area," says Mikael Johansson, quantum strategist at CSC, Finland's IT Centre for Science. Being small has helped, with collaboration the norm across disciplines, and between academia and industry. "Maybe that has been out of necessity, because we have limited resources to work with; but in the case of quantum technologies this is really an asset. We haven't been siloed within the country, so we all work together and can see the broader picture."

IQM is a cornerstone of the ecosystem. Set up in 2018 by researchers from Aalto University and VTT, it builds quantum processors for research labs and supercomputing data centres, and now employs over 160 people at four locations across Europe. Another is Bluefors, set up in 2008 to commercialise a cryogen-free ultra-low-temperature system developed at Aalto's predecessor, Helsinki University of Technology; achieving such low temperatures is essential for building quantum computers and other devices. The company now has over 250 employees and an annual revenue of approximately €80 million.

Building on five qubits

IQM and VTT built Helmi, the five-qubit quantum computer inaugurated last November in Espoo. Five qubits is relatively modest compared with other projects: IBM last year turned on a machine boasting more than 100 qubits. But Majumdar says Helmi is just the beginning of Finland's quantum journey, with upgrades expected to 20 qubits in 2023 and to 50 qubits in 2024.

"You can run very simple algorithms, so it is for research and education rather than offering commercial benefits. But it is crucial for getting the feel of how a quantum computer works," says Juha Vartiainen, chief operating officer at IQM and another of its four co-founders. The aim is to use this infrastructure to energise the ecosystem. Goetz draws an analogy with Britain's high-performance computing ecosystem around Cambridge, where powerful computing infrastructure stimulated the start-up scene. "And that's what we're seeing, with start-ups being born here or relocating to Finland."

One example is QuantrolOx, a spin-off from the University of Oxford that has come to Espoo to build its qubit control software. Founded in 2021, the company raised €1.4 million in seed funding this February to further develop its business. "The company can improve its product with the help of this quantum computer," says Vartiainen. On top of that, a deal announced in April between QuantrolOx and Indian quantum and artificial intelligence company QpiAI will result in the latter opening an office in Finland.

Meanwhile the Indian IT company Tech Mahindra is to set up a quantum centre of excellence in Helsinki, with the goal of creating 200 technology and business jobs over the next five years. "This can be a kind of incubation centre for quantum algorithm development," says Majumdar, who was part of the trade delegation that sealed the deal. "You can argue that you can do the computing in the cloud, using systems that are already available, but having access to a machine and actual hardware, where you can do even low-level software development, is a unique opportunity."

In addition to the hardware, Finland's assets for start-ups include plenty of talented engineers, and a strong venture capital community. "You have events like Slush (a high-profile, annual tech exhibition in Helsinki), and a very good network of people who bring money to the table," says Goetz. There are also plenty of good ideas waiting to be exploited. "There's quite a build-up of intellectual property in the universities and VTT, so in terms of spinning out, there is a lot to build companies around," says Vartiainen.

Quantum meets supercomputing

Having an operational quantum computer will also help bring quantum and traditional high-performance computing together. "Even though the quantum processor is small, it's a real device, with real properties and real behaviour, that we can now integrate with the pan-European LUMI supercomputer, hosted in our data centre," says Johansson. "Having it there means we can start doing things that were not possible before. We can start developing the software stack and algorithms, and we can get an understanding of how it fits into the workflow for real end-user problems."

These end-users are the one gap in Finland's quantum ecosystem. "We want them to get engaged as soon as possible in quantum activities, but there is a threshold that needs to be crossed," says Majumdar. "Some of them think this is too far off, that they can wait for it to evolve." To this end VTT is setting up a foresight programme to help companies see beyond the threshold. "We can help them identify what they can do in their specific industry, at each qubit capacity progression."

This search for end-users is one reason that IQM has expanded beyond Finland, opening offices in Munich, Bilbao and most recently Paris. "In places like Munich, for example, you have a very high density of big industry players who have their quantum teams there," says Goetz. "It's a different kind of ecosystem, not focused so much on the systems, but more on use cases." But its roots in Finland remain strong, with the European Investment Bank announcing last week that it is putting €35 million into the company's new processor fabrication facility in Espoo.

Visit link:

The Ecosystem: Finland punches above its weight in quantum - Science Business


Hyperion Research Expands Analyst Team – HPCwire

Posted: at 10:29 pm

ST. PAUL, Minn., May 2, 2022 – Responding to the company's steady year-over-year growth and new business opportunities, including over 20 new clients this past year alone, Hyperion Research, the leading industry analyst and market intelligence firm for high performance computing (HPC), AI, cloud, quantum, and associated emerging markets, is adding new analysts and announcing promotions among existing staff members.

New analyst:

Promotions:

Expanded roles:

Hyperion Research looks forward to continuing to help worldwide organizations make effective decisions and seize growth opportunities by providing superior data-based market research and analysis on HPC, AI, cloud, big data, quantum computing and emerging technologies.

About Hyperion Research

Hyperion Research is the premier industry analyst and market intelligence firm for high performance computing (HPC) and associated emerging markets. Hyperion Research analysts provide timely, in-depth mission-critical insight across a broad portfolio of advanced computing market segments, including High Performance Computing (HPC), Advanced Artificial Intelligence (AI), High-Performance Data Analysis (HPDA), Quantum Computing, Cloud and Edge Computing.

Hyperion Research provides data-driven research, analysis and recommendations for technologies, applications, and markets to help organizations worldwide make effective decisions and seize growth opportunities. Research includes market sizing and forecasting, share tracking, segmentation, technology, and related trend analysis, and both user and vendor analysis for multi-user technical server technology used for HPC, AI, Cloud, Quantum and HPDA (high performance data analysis). The company provides thought leadership and practical guidance for users, vendors, and other members of the global HPC community by focusing on key market and technology trends across government, industry, commerce, and academia.

The industry analysts at Hyperion Research have been at the forefront of helping private and public organizations and government agencies make intelligent, fact-based decisions related to business impact and technology direction in the complex and competitive landscape of advanced computing and emerging technologies for more than 25 years.

Source: Hyperion Research

Read the original:

Hyperion Research Expands Analyst Team - HPCwire


Building the Future We Deserve A Cyber Success Story – Security Today

Posted: at 10:29 pm

Building the Future We Deserve A Cyber Success Story

Consider a conventional computer. It uses a small (64-bit) processor architecture and is considered excellent for solving linear problems. Many past and present problems are linear, and 64-bit architectures have been sufficient to solve them (a 64-bit register can hold any of 2⁶⁴, or over 18 quintillion [about 1.8×10¹⁹], different values). However, if you want to solve a much more complex problem, such as those that occur in natural chemistry and physics, using a linear approach is not possible due to the massive numbers and variables that must be considered to reach a solution. Conventional computing and linear problem-solving approaches are quickly overwhelmed by this complexity.
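As a quick sanity check on those figures, Python's arbitrary-precision integers (an illustration, not part of the article) make the 64-bit range easy to compute:

```python
# A 64-bit register can represent 2**64 distinct values (0 through 2**64 - 1).
distinct_values = 2 ** 64
print(distinct_values)            # 18446744073709551616
print(f"{distinct_values:.2e}")   # 1.84e+19, i.e. "over 18 quintillion"
```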

Enter a quantum processor that harnesses bits that are atoms or subatomic particles. Because of the nature of quantum mechanics, those bits can represent anything (e.g., 0, 1, or anything in between) and potentially exist anywhere in space. If you connect those bits with entanglement into a circuit, for example a 73-quantum-bit (qubit) circuit, the state space is now 2 to the 73rd power (2⁷³), about 9.4×10²¹ values. Treated as bits, that works out to roughly a zettabyte of data, a scale approaching the amount of data the world stores in a year. Imagine a computer that can, in effect, work across that entire state space in a single instruction.
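The scale of a 73-qubit state space can be checked the same way. The sketch below (an illustration using the article's rough one-classical-bit-per-basis-state analogy, not a statement about how quantum information is actually measured) converts 2⁷³ bits into conventional storage units:

```python
# 73 entangled qubits span 2**73 basis states.
states = 2 ** 73
print(f"{states:.2e} basis states")       # 9.44e+21

# Treating one classical bit per basis state (a rough analogy only):
zettabytes = (states / 8) / 1e21          # bits -> bytes -> zettabytes
print(f"about {zettabytes:.2f} zettabytes")   # about 1.18
```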

This computational capability is amazing for operations such as molecular science, neural networks, and weather simulation. As another point of reference, you have on the order of 100 billion neurons in your brain. Think about interrogating the whole state of a complex neural network like your brain in one instruction. This is possible in the future using quantum computers. It is fascinating, and it will open us up to huge breakthroughs in technology, science and nature.

This fantastic computational power is a double-edged sword, however. The problem is that our current public-key encryption (think the entire internet) is based on the difficulty of factoring numbers that are the product of two large primes. Quantum's large word sizes are well suited to this factoring problem, rendering much of our current cryptographic capability useless. Also, the current cryptography on nearly all electronic devices, whether a watch, phone, computer, or satellite, is based on the same prime factorization. So far, factoring such a number on a conventional computer is still extremely difficult. But quantum computers pose a threat because they could do it quickly.
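A toy sketch can make that threat concrete. The numbers below are tiny and purely illustrative (real RSA moduli are hundreds of digits long, far beyond trial division, which is why a quantum attack via Shor's algorithm matters); the point is that anyone who can factor the public modulus can reconstruct the private key:

```python
def trial_factor(n):
    """Factor n by trial division -- feasible only for tiny n."""
    p = 2
    while p * p <= n:
        if n % p == 0:
            return p, n // p
        p += 1
    raise ValueError("n is prime")

# A toy RSA-style public key: modulus n = p*q and public exponent e.
p, q = 61, 53
n = p * q                      # 3233
e = 17
message = 65
ciphertext = pow(message, e, n)

# An attacker who can factor n recovers the private key:
p2, q2 = trial_factor(n)
phi = (p2 - 1) * (q2 - 1)
d = pow(e, -1, phi)            # modular inverse (Python 3.8+)
recovered = pow(ciphertext, d, n)
print(recovered == message)    # True: factoring n breaks the scheme
```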

Read the rest here:

Building the Future We Deserve A Cyber Success Story - Security Today


Research could lead to the development of new superconductors | Binghamton News – Binghamton

Posted: at 10:29 pm

From MRI machines to particle accelerators and Maglev trains, superconductors have revolutionized modern technology, and they have the potential to do so much more.

"The main property of a superconducting material is that it can conduct electricity without resistance when cooled below a certain material-dependent critical temperature," explained Binghamton University Associate Professor of Physics Elena Roxana Margine.

This amazing quality, however, comes at a cost: The most commonly used niobium-based superconductors operate at extremely low temperatures, around 10 kelvin, equivalent to −442 degrees Fahrenheit or −263 degrees Celsius.

For the past 50 years, scientists have been searching for superconductors that can work at higher critical temperatures, ideally room temperature, although 100 kelvin (−173 degrees Celsius or −280 degrees Fahrenheit) is acceptable for a wide range of applications. Unfortunately, the high-temperature superconductors already discovered are difficult to manufacture. Copper oxide-based superconductors are ceramic compounds, for example, which are brittle and difficult to fabricate into wires, while hydrogen-based superconductors can only be synthesized under extremely high pressure, so high, in fact, that it's similar to pressures found close to Earth's core.
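The temperature conversions quoted in this article are easy to verify; this small helper (illustrative only) reproduces the figures:

```python
def kelvin_to_celsius(k: float) -> float:
    return k - 273.15

def kelvin_to_fahrenheit(k: float) -> float:
    return k * 9 / 5 - 459.67

for k in (10, 100):
    print(f"{k} K = {kelvin_to_celsius(k):.0f} C = {kelvin_to_fahrenheit(k):.0f} F")
# 10 K  -> -263 C, -442 F
# 100 K -> -173 C, -280 F
```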

Margine's work in computational physics could potentially lead to breakthroughs in this field. Last summer, she received three National Science Foundation (NSF) grants to aid that effort.

A $3.86 million grant from NSF's Office of Advanced Cyberinfrastructure will help develop a comprehensive software ecosystem to model and predict advanced functional properties of materials by using many-body electronic structure methods. Margine is one of several co-principal investigators (PIs) on the grant, which is led by Feliciano Giustino from the University of Texas at Austin; Binghamton's portion of the grant is $838,500.

"The goal of this project is to expand and combine the complementary strengths of three software packages developed by the PIs of this grant, and to build in compatibility layers for major density-functional theory codes," Margine explained. "This cyberinfrastructure, in turn, will allow scientists to perform systematic and predictive calculations of properties that underpin the development of next-generation materials for energy, computing and quantum technologies."

Margine is the sole principal investigator for a $400,000 continuing grant from the NSF's Division of Materials Research that will allow her to implement new capabilities for modeling superconducting materials.

Another $226,947 grant from the Division of Materials Research will aid the search for superconducting materials that can operate at a higher critical temperature. The team, led by Margine and Associate Professor of Physics Alexey Kolmogorov, will explore promising combinations of boron, carbon and various metals, using advanced modeling methods and computational tools. Kolmogorov will use a combination of evolutionary algorithms and machine learning methods to identify synthesizable compounds, while Margine will investigate the most suitable candidate materials with potential for high-temperature superconductivity. That's not as simple as opening a laptop, however.

Superconductivity is a complex process determined by the interaction between electrons and atomic vibrations in a material. Accurately modeling this interaction takes not only complex computer codes and calculations but also immense processing power.

"In order to run calculations like this, you need supercomputers," Margine said.

For the past few years, Margine has used the Expanse cluster at the San Diego Supercomputer Center; this year, she was also awarded resources on the Frontera supercomputer at the Texas Advanced Computing Center.

The grants also support the training of undergraduate and graduate students, as well as postdoctoral researchers in computational materials science and high-performance computing. "These grants will also contribute to the development of a more diverse and inclusive STEM workforce by organizing annual schools for users of the codes," Margine said. One such training session will be held this June at the University of Texas at Austin.

Through computational modeling, researchers may be able to predict which materials would excel as superconductors, particularly those that can operate at higher critical temperatures. Understanding how they work at the atomic level could someday lead to innovations in energy storage, medicine, electronics, transport and even quantum computing.

"What we are trying to do is develop methods with improved prediction capabilities that will pave the way for rational design of new superconductors," Margine said.

Link:

Research could lead to the development of new superconductors | Binghamton News - Binghamton

