How does the genomic naive public perceive whole genomic testing for health purposes? A scoping review | European Journal of Human Genetics

Study characteristics

Sixteen studies were included in the analysis [46,47,48,49,50,51,52,53,54,55,56,57,58,59,60,61]. Most were quantitative (n=12), using questionnaires to assess public perceptions [46,47,48,49,50,51,52, 54, 56, 57, 59, 61]. Three studies conducted focus groups [53, 55, 60], while one study used both focus groups and a survey [58]. The US has contributed the most to this field thus far, undertaking six of the 16 studies identified in the literature search [49, 50, 52, 53, 55, 58]. This is followed by Canada (n=2) [48, 51] and Japan (n=2) [54, 59]. Each of the following countries contributed one study: Jordan [56], Korea [57], The Netherlands [61], Singapore [60], Qatar [46] and the UK [47]. Ten of the studies attempted to recruit a representative sample [46,47,48,49, 51, 53, 54, 56, 59, 61]. More highly educated participant populations (compared to the general population) were noted in four studies [48, 59,60,61]. Three studies recruited participants from specific sites [52, 55, 57]. No studies attempted to discern the views of underrepresented populations aside from Mallow et al. [58], who conducted focus groups with a rural community (Table 3, Supplementary File 3).

Education level influenced decisions to hypothetically partake in genomic testing in different ways [49, 51, 56, 59, 61]. Three studies found that more educated individuals were more likely to be interested in testing [49, 56, 59], while two other studies found that being more educated led to more critical attitudes towards testing [51, 61]. One study found no association between education level and attitude towards testing [57]. Khadir, Al-Qerem and Jarrar [56] found that having a low perceived knowledge of genomic testing's social consequences reduced the likelihood of having a reserved attitude. Abdul Rahim et al. [46] found that genetic/genomic knowledge did not impact whether a participant would engage in testing.

The age of the participant was reported to influence decision making [49, 54, 56, 57, 59], with no consensus on the attitudes of older versus younger adults. Lee et al. [57] found that older adults were more likely to approve of integrating personalised medicine testing into standard healthcare. Two other studies also found that older adults were slightly more interested in genomic testing [54, 56]. In contrast, Okita et al. [59] found that older adults were less willing to partake in genomic testing, while Dodson et al. [49] found no association between age and likelihood of having testing.

Abdul Rahim et al. [46] found that marital status was not significantly associated with willingness to partake in testing in Qatari adults, while Dodson et al. [49] found American participants planning to have children in the next five years had significantly increased interest in testing. Dodson et al. [49] was the only study to investigate whether ethnicity influenced decision-making, showing no association.

Okita et al. [59] assessed the influence of employment status on willingness to partake, reporting that students had significantly more positive attitudes towards testing compared to employed respondents. Bombard et al. [48] found that having an income of more than CAD$80,000 led to an 11-12% decrease in the likelihood of believing parents have a responsibility to have their child tested via expanded NBS. No study assessed the impact of sex on attitude towards testing; however, Lee et al. [57] found that sex did not significantly influence whether the participant had heard of personalised medicine.

Using the NASSS domains, we were able to map primary source data to technology (Domain 2), value proposition (Domain 3), the adopter system (Domain 4) and the wider context (Domain 6) (Fig. 2). Greenhalgh et al. [39] do not provide specific definitions for their domains; rather, they frame these domains in the form of questions that need to be answered. We replicated this approach and adapted the questions to align with our study questions (Supplementary File 4).

The NASSS Framework considers the influences on adoption, nonadoption, abandonment, spread, scale-up, and sustainability of healthcare technologies. Domains 2 (Technology), 3 (Value proposition), 4 (Adopter system) and 6 (Wider context) of the NASSS Framework have been addressed in this scoping review to consider how public perceptions are incorporated in the framework.

Domain 2 considers the technical aspects of the technology that will influence its implementation [39]. Questions 2B (the types of data generated), 2C (the knowledge needed to use the technology) and 2E (who owns the IP generated by the technology) are addressed in the primary sources.

Question 2B considers the knowledge generated by the technology and how this is perceived by patients and/or caregivers. Two studies cited the accuracy of genetic information as an issue for their participants [54, 58].

Greenhalgh et al. [39] define question 2C as the type of knowledge needed by both healthcare providers and patients to use the technology; however, we focus only on the views of the general public. Although patients do not necessarily need genomic knowledge to undertake testing, the informed consent process is essential, and understanding the baseline genomic knowledge of the public is beneficial for those taking consent. Knowledge of genetics and genomics was assessed in several different ways across the included articles [46, 52,53,54, 56, 58, 60]. These included asking participants whether they had heard of various genetic and/or genomic terms, how they had heard about genomic testing, how they would describe genomics (in a focus group setting), and questions on genetics knowledge.

Abdul Rahim et al. [46] found that less than a third (n=245) of survey respondents had heard of genomic testing, while just over half (n=447) had heard of genetic testing. Gibson, Hohmeier and Smith [52] found that 54% (n=7) of their participants had heard the term "pharmacogenomics". Hishiyama, Minari and Suganuma [54] found that more than two-thirds of their participants had heard of classic genetic terminology (e.g. DNA, gene, chromosome), whereas fewer participants had heard of newer genomics terminology (e.g. personal genome and pharmacogenomics). Hahn et al. [53] found that the majority of their participants had not heard the terms "genomic medicine" or "personalised medicine". Ong et al. [60] found that English- and Mandarin-speaking participants had heard of the term "personalised medicine" but not "precision medicine", while Malay-speaking participants had not heard of either term.

Three studies questioned participants on how they had heard about genomics [46, 52, 53]. Abdul Rahim et al. [46] asked about both genetic and genomic testing whereas Gibson, Hohmeier and Smith [52] asked their participants where they had heard certain terms from. Abdul Rahim et al. [46] found that 30% (n=69) of participants who knew of genomic testing, heard about it through word of mouth. Gibson, Hohmeier and Smith [52] found that 54% (n=7) of participants had heard of pharmacogenomic testing, and other key terms associated with genomics, from the internet. Hahn et al. [53] used focus groups to discern participant understanding of the term genomic medicine, and found that some college students had heard of the term on the news and in biology classes.

Two studies used focus groups to discern genomic understanding [53, 58]. Mallow et al. [58] used a community participatory research approach; community leaders suggested they use terms like "genes" and "family health history" rather than scientific terminology to assist discussions with the community. They found that participants were more likely to describe inheriting disease than inheriting health and wellness [58]. Hahn et al. [53] found that their focus group participants described genomic medicine in terms of genetics, family history, the genome project, using genetics to heal people, and cloning. Ong et al. [60] also used focus groups to discuss baseline understanding of personalised medicine and precision medicine, divided by the primary language spoken by the participants, allowing for discussions on terminology specific to each language.

Knowledge of genetic and/or genomic facts was directly assessed in two studies [46, 56]. Abdul Rahim et al. [46] and Khadir, Al-Qerem and Jarrar [56] both questioned respondents on their basic genetic literacy via survey questions. Abdul Rahim et al. [46] found that 56.1% of survey respondents (n=464) were able to answer at least 5 out of 8 genetic literacy questions correctly, while Khadir, Al-Qerem and Jarrar [56] found that participants were knowledgeable about hereditary genetic information but not other scientific facts. Khadir, Al-Qerem and Jarrar [56] also gave participants the opportunity to self-report their knowledge of genetics. Many participants reported having sufficient knowledge of the basic medical uses of testing and its potential social consequences, such as refusing testing and the right of third parties to request an individual's genetic test results [56].

For genomic testing, we have interpreted question 2E to mean whether patients own their genetic information or whether it belongs to the group that conducts the sequencing. Four studies found that participants had concerns about the privacy of their or their child's genetic information [46, 53, 55, 57]. Hishiyama, Minari and Suganuma [54] also found that 37.1% (n=1112) of their participants were concerned about the management and storage of genetic information.

Greenhalgh et al. [39] use this domain to consider the value placed on the technology by healthcare professionals and the patient. Question 3B (demand-side value to the patient) is addressed in the primary sources.

Greenhalgh et al. [39] define this question as the downstream value of the technology, including the evidence of benefit to patients and affordability. Willingness to pay for genomic testing was directly assessed in three studies [50, 52, 57]. Gibson, Hohmeier and Smith [52] found that if the entire cost of the pharmacogenomic test were covered by insurance, 89% of participants (n=24) would undertake testing. Lee et al. [57] determined that age, gender, income, inconvenience of testing and prior knowledge all influenced whether participants would pay extra for personalised medical testing; the cost of testing was a concern for 44.8% of participants (n=316) [57]. Edgar et al. [50] found that most adoptees (72.4%) and non-adoptees (80.3%) were willing to pay between US$1 and US$499. Education level was a predictor of adoptees' willingness to pay, while income predicted willingness to pay in non-adoptees [50]. Abdul Rahim et al. [46] did not directly assess willingness to pay; however, they noted that a high income was associated with participant willingness to partake in testing.

Hahn et al. [53] and Ong et al. [60] did not directly assess willingness to pay for genomic sequencing, but participants did express concerns about the cost of testing to the individual and whether there would be equitable access to testing.

Greenhalgh et al. [39] use this domain to consider the adoption of the technology. The adopter system includes caregivers, healthcare professionals and patients. Question 4B addresses whether patients will adopt a technology, while 4C addresses whether lay caregivers are available to facilitate adoption. As we did not include patients or lay caregivers in our review, we have adapted these definitions to incorporate hypothetical patients and/or carers under the term "genomic naive public". Greenhalgh et al. [39] also emphasise patient acceptance and family conflict as factors influencing use of a technology.

Several personal values were identified across the included studies [46, 48,49,50,51,52,53,54, 56, 59]. Abdul Rahim et al. [46] and Hishiyama, Minari and Suganuma [54] found that contributing to science and medical research were reasons to partake. Other reasons for partaking in genomic testing suggested by Qatari adults included improved health knowledge and prevention of future health conditions [46]. This was also suggested by participants in Etchegary et al. [51], Hahn et al. [53] and Khadir, Al-Qerem and Jarrar [56].

Bombard et al. [48] found that most of their participants preferred using scientific evidence (82%, n=994) and receiving expert advice (74%, n=897) when making important healthcare decisions. However, only half (53%) of participants had trust in healthcare (n=639). Hahn et al. [53] also found that many participants were sceptical of genomic medicine specifically, and often associated it with genetic engineering and cloning despite these not being directly related to genomic testing. Some participants felt they did not need the information genomic testing could provide, while others who would hypothetically want testing, believed it could promote the development of new treatments and provide more information on family history [53].

Primary reasons for not being willing to partake in testing, as noted by Abdul Rahim et al. [46], were a lack of time, information or knowledge, and privacy concerns. Similar concerns were suggested by Hahn et al. [53] and Lee et al. [57]. Fear of the unknown was also suggested in Hahn et al. [53] and Mallow et al. [58]. Participants in Hahn et al. [53] also noted they may be uncomfortable with the results, and that the results may be too deterministic.

Aside from general concerns about the nature of genomic testing, concern regarding communication of genetic information among family members was also highlighted [47, 51, 53, 56, 58, 61]. Ballard et al. [47] noted that most participants, when asked to imagine that either they or a family member had a genetic condition, believed other family members who might also be affected should be notified. Etchegary et al. [51] and Khadir, Al-Qerem and Jarrar [56] also found that most participants would share genomic test results with family members. Participants in Hahn et al. [53] generally had a positive view of learning about genetic information if it would help other family members, as some had family members who had passed away without explanation. Mallow et al. [58], however, found that communicating genetic information to family members may be an issue. Participants cited several reasons for this, including upsetting children and creating family conflict, older family members being unwilling to disclose information, and stigmatisation by the community, particularly if the information in question regarded mental illness or substance abuse disorders [58]. Participants also suggested they would only discuss genetic risk if there was a health crisis in the family [58]. Etchegary et al. [51], although noting that many participants would want to share information, found that those with the highest education levels and incomes were less likely to share results with family members. Vermeulen et al. [61] also found that 17% of their participants (n=160) were worried about causing friction within their families. However, participants who believed family history assessments were worthwhile cited disease prevention as a benefit of involving family members [61].

Greenhalgh et al. [39] describe the wider context as the institutional and sociocultural contexts. Examples of the wider context include health policy, fiscal policy, statements and positions of professional and peak bodies, as well as law and regulation. Here, in order to respond to our research questions, we focus on the socio-cultural aspects of the public.

Societal concerns were noted in many studies [51, 53,54,55,56, 58, 60, 61]. Twenty-two percent of participants (n=1425) in the Hishiyama, Minari and Suganuma [54] study noted employment and insurance discrimination as a concern. This was also noted in Etchegary et al. [51] and Khadir, Al-Qerem and Jarrar [56]. Participants in Hahn et al. [53] and Mallow et al. [58] noted discrimination and segregation as key societal issues that may arise. One-third of participants (n=311) in Vermeulen et al. [61] thought that individuals may be coerced into testing if it is normalised.

Cultural context may influence participant responses. For example, Abdul Rahim et al. found that 45.1% of their respondents (n=241) were in consanguineous relationships [46]. No other study reported on consanguinity, demonstrating that different cultures prioritise different elements when reporting. Abdul Rahim et al. found that 70.9% of their population (n=584) were willing to undergo genomic testing [46], whereas Dodson et al. found that 39.5% of their US population (n=805) were somewhat interested and 19.1% (n=389) were definitely interested in genomic testing [49]. These papers demonstrate that different cultures can influence perceptions of genomic testing. However, the Caucasian US population in Gibson et al. was more willing to undergo testing at 81.0% (n=21) [52], showing that even within the same country there can be cultural differences that may lead to differences in perception.

Global Genetic Testing Market Research Report 2022 Featuring Major Players – Abbott Laboratories, Myriad Genetics, F. Hoffmann-La Roche, Illumina, and Thermo Fisher Scientific

DUBLIN--(BUSINESS WIRE)--The "Global Genetic Testing Market Research and Forecast, 2022-2028" report has been added to ResearchAndMarkets.com's offering.

The global genetic testing market is growing at a significant CAGR during the forecast period. Genetic disorders can be caused by a change in a single gene (monogenic disorders), by changes in multiple genes combined with environmental factors and gene mutations, or by chromosomal damage. Genetic testing is a medical test used to identify mutations in genes or chromosomes.

The key benefits of genetic testing are the chance to learn the risk of a disease that may be preventable, to identify a disease or disease subtype, to identify the cause of a disease, and to determine treatment options. Diseases that can be identified by genetic testing include breast and ovarian cancer, age-related macular degeneration (AMD), bipolar disorder, Parkinson's disease, celiac disease, and psoriasis.

The global genetic testing market is projected to grow considerably in the coming years due to the prevalence of genetic disorders, cancer, and chronic disease. Moreover, continuous advancement by medical companies in the genetic diagnostics field is also augmenting market growth.

These companies are developing new and better tests for the accurate diagnosis of the most prevalent as well as rare diseases. In addition, increasing public awareness about health and the rising mortality rate from genetic diseases across the globe are also major factors driving demand for genetic testing.

Moreover, the adoption of direct-to-consumer (DTC) genetic testing kits in countries such as the US, China, and Japan is increasing rapidly. With growing technological acceptance, awareness programs, and falling costs, the market for DTC genetic testing kits is likely to see a significant boost over the forecast period. However, the lack of diagnostic infrastructure in emerging economies is a challenge for market growth.

Regional Outlooks

North America is estimated to contribute a significant share of the global genetic testing market due to high awareness of advanced healthcare treatments, well-developed healthcare infrastructure, the presence of key players, and the availability of drugs.

Moreover, increased government initiatives to enhance healthcare facilities and fund research in the region are also major factors behind its significant market share. In the US, government initiatives such as the CDC's Evaluation of Genomic Applications in Practice and Prevention (EGAPP) are also motivating market growth.

One of the key goals of the initiative is to offer timely, objective, and credible information linked to the available scientific evidence. This information will allow healthcare workers, payers, consumers, policymakers, and others to identify genetic tests that are safe and useful.

Asia-Pacific will have considerable growth in the global Genetic Testing Market

In Asia-Pacific, the market is growing due to government initiatives in research and the increasing prevalence of chronic diseases. Apart from cancer, genetic testing has also come within easy reach for the diagnosis of inherited cardiovascular diseases such as cardiac amyloidosis, Brugada syndrome, and familial dilated cardiomyopathy. As the region has a high incidence of cardiovascular disease, significant scope for genetic testing can be expected in the region during the forecast period.

Market Players Outlook

The report covers the analysis of various players operating in the global genetic testing market. Some of the major players covered in the report include Abbott Laboratories, Myriad Genetics, Inc., F. Hoffmann-La Roche Ltd., Illumina, Inc., and Thermo Fisher Scientific, Inc. To survive in the market, these players adopt different marketing strategies such as mergers and acquisitions, product launches, and geographical expansion.

The Report Covers

Market Segmentation

Global Genetic Testing Market by Technology

Global Genetic Testing Market by Type

Global Genetic Testing Market by Disease

Company Profiles

For more information about this report visit https://www.researchandmarkets.com/r/fvmuud

About ResearchAndMarkets.com

ResearchAndMarkets.com is the world's leading source for international market research reports and market data. We provide you with the latest data on international and regional markets, key industries, the top companies, new products and the latest trends.

Tracking Transcripts in Biologics and Cell Therapies – Genetic Engineering & Biotechnology News

The outcomes from biologics and cell therapies hinge on what they secrete. More specifically, protein secretion "has important impacts for both the quality and quantity of a therapeutic produced using bioprocessing," says Dino Di Carlo, PhD, the Armond and Elena Hairapetian chair in engineering and medicine at the University of California, Los Angeles.

"For biologics," he continues, "the rate at which producer cells secrete protein therapeutics (for example, monoclonal antibodies) drives the amount of therapeutic that can be produced per batch and the ultimate costs of production; for cell therapies, secreted proteins are a key product attribute that defines a high-quality product."

A previous GEN story explained how secretion-based screening could improve cell therapies. Here, Di Carlo describes how he and his colleagues used secretion-encoded single-cell sequencing (SEC-seq) to link the secretions and transcriptomes of individual antibody-secreting cells.

The key finding from this work, Di Carlo says, is that gene transcripts are not necessarily correlated to secreted proteins, which makes the community rethink genetic modification approaches that just focus on overexpressing the gene for the target secreted protein to achieve an improved therapeutic effect. For example, the SEC-seq study showed that the levels of mRNA transcripts for immunoglobulin G (IgG) proteins did not correlate with the amount of assembled and secreted IgG (heavy and light chain) in human plasma cells.

"Instead, in highly secreting cells, we found pathways upregulated that drive energy production, protein translation, protein trafficking, and response to misfolded proteins," Di Carlo explains. This suggests that having enough transcripts around to produce the secreted protein is not the bottleneck for high levels of secretion. As he adds, "Other pathways are the likely bottleneck, and they are needed to make and traffic a lot of protein to the membrane to secrete it, as well as deal with misfolded proteins that result from translation of large quantities of proteins."
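
As a rough illustration of the kind of per-cell comparison described here (a hedged sketch, not the study's actual pipeline), one could test whether transcript abundance tracks secretion by correlating paired measurements across cells. All names and data below are invented for illustration:

    # Hypothetical sketch: does per-cell IgG transcript abundance predict
    # measured IgG secretion? Simulated data stand in for real SEC-seq output.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)
    n_cells = 500

    # Simulated per-cell transcript counts for the secreted protein.
    transcripts = rng.lognormal(mean=3.0, sigma=0.8, size=n_cells)

    # Simulated secretion, dominated by a separate "trafficking capacity"
    # factor rather than transcript abundance, mirroring the finding above.
    trafficking = rng.lognormal(mean=0.0, sigma=1.0, size=n_cells)
    secretion = 0.1 * transcripts + 5.0 * trafficking

    rho, p = spearmanr(transcripts, secretion)
    print(f"Spearman rho = {rho:.2f} (p = {p:.1e})")
    # A rho near zero would echo the reported result that transcript levels
    # alone do not determine how much assembled IgG a cell secretes.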

Upon understanding the importance of secretions from biologics and cell-based therapies, what can a commercial bioprocessor do about it? "One implication is for bioprocessors interested in the genetic modification of therapeutic cells to secrete more of a therapeutic protein," Di Carlo says. "Our results suggest it is not sufficient to genetically modify your cell type of interest with just the gene to produce a secreted protein, because you may also need to drive these other associated pathways to enhance the level of secretion."

The SEC-seq method could also be applied in other ways. "Bioprocessors could use this technique to identify the pathways that drive secretion of critical cytokines and growth factors for their therapeutic cell type of interest," Di Carlo says. "For example, there may be differences in what drives high levels of secretion between the human plasma cells secreting IgG and natural killer cells secreting cytokines. This information could be used by a bioprocessor to improve the base cell types used for a therapeutic product or to perform quality control or production-batch sorting based on these factors."

"For biologics, bioprocessors can use SEC-seq to uncover what drives higher secretion of a therapeutic protein by producer cell types, like CHO cells, HEK293 cells, etc.," Di Carlo explains. "The information obtained could help engineer the next generation of more efficient and productive producer cell lines."

So, making an effective biologic or cell therapy depends on what cells secrete. Fortunately, Di Carlo's work will help bioprocessors better understand and improve that secretion.

Empyrean Neuroscience Launches with $22M Series A and Genetic Engineering Platform to Advance Pipeline of Neuroactive Compounds Targeting CNS Disorders

NEW YORK & CAMBRIDGE, England--(BUSINESS WIRE)--Empyrean Neuroscience, Inc., a leading genetic engineering company dedicated to developing neuroactive compounds to treat neuropsychiatric and neurologic disorders, today announced that it has launched with a $22 million Series A financing and a genetic engineering platform to advance a pipeline of neuroactive compounds targeting disorders of the central nervous system (CNS). The company is founded on a proprietary platform designed to genetically engineer small molecule therapeutics from fungi and plants. Veteran biotech executives Usman "Oz" Azam, M.D., Chief Executive Officer, and Fred Grossman, D.O., FAPA, Chief Medical Officer, lead the company.

Through precision targeting and engineering of the fungal and plant genomes, Empyrean is working to enhance and modulate neuroactive compounds produced by these kingdoms. The platform is being used to identify therapeutic fungal alkaloids, cannabinoids, and other small molecules that may exhibit enhanced efficacy and safety. In addition, the platform is designed to discover novel small molecules that may exhibit a therapeutic benefit.

"There is an enormous medical need for safe and effective therapeutics that treat neuropsychiatric and neurologic disorders, and we believe genetic engineering provides the answer," said Dr. Azam, Empyrean's Chief Executive Officer. "By applying our genetic engineering platform to make precise modifications to the genomes of fungi and plants, we can change the amount and kind of neuroactive small molecules they produce, with the goal of developing safe and effective treatments for difficult-to-treat diseases of the CNS."

The company's development pipeline includes fungal alkaloids, cannabinoids, and other neuroactive compounds, such as N,N-dimethyltryptamine (DMT), for the potential treatment of major depressive disorder (MDD), post-traumatic stress disorder (PTSD), neurologic disorders, substance abuse and dependence, and chronic pain. Investigational New Drug (IND)-enabling studies of the company's first genetically engineered encapsulated mushroom drug product are currently underway, and the company aims to enter the clinic for MDD in 2023.

"Fungal alkaloids and cannabinoids have shown promise in treating depression, PTSD, anxiety, and other neuropsychiatric and neurologic disorders," said Dr. Grossman, Empyrean's Chief Medical Officer. "We believe our approach of genetically engineering fungi and plants can improve their safety and efficacy and will ultimately help to address the substantial unmet medical need in patients who suffer from these diseases."

As part of its genetic engineering platform, the company has licensed CRISPR/Cas9 technology from ERS Genomics for genetic engineering applications related to its therapeutic pipeline.

Dr. Azam was previously President and Chief Executive Officer of Tmunity Therapeutics, a biotech developing genetically engineered cell therapies for applications in cancer. Before Tmunity, he was Global Head of Cell & Gene Therapies at Novartis, where he was responsible for commercial operations, business development licensing, new product commercialization, clinical development, regulatory affairs, and other aspects of the global cell and gene therapies business. He was Chief Executive Officer of Novaccel Therapeutics, Chief Medical Officer of Aspreva Pharmaceuticals, and earlier in his career, held positions at Johnson & Johnson, GSK, and Pfizer. Dr. Azam received his M.D. from the University of Liverpool School of Medicine and is board certified in obstetrics and gynecology in the United Kingdom.

Before joining Empyrean, Dr. Grossman was Chief Medical Officer of Mesoblast Ltd. and President and Chief Medical Officer of Glenmark Pharmaceuticals. He has held executive leadership positions in large pharmaceutical companies, including Eli Lilly, Johnson & Johnson, Bristol Myers Squibb, and Sunovion. He has been responsible for leading the development, approval, and supporting the launch of numerous global medications addressing significant unmet medical needs across therapeutic areas, particularly in the CNS. He has held academic appointments and has authored numerous scientific publications. He was trained in psychiatry at Hahnemann University in Philadelphia and at the National Institute of Mental Health in Bethesda, Maryland and completed a Fellowship in the Section on Clinical Pharmacology at the National Institutes of Health. Dr. Grossman is a board-certified psychiatrist and Fellow of the American Psychiatric Association.

About Empyrean Neuroscience

Empyrean Neuroscience is a genetic engineering company developing a pipeline of neuroactive therapeutics to treat a range of neuropsychiatric and neurologic disorders. Through precision genetic modification, transformation, and regeneration of fungi and plants, the platform allows for the creation of small molecule therapeutics. In addition, the platform enables the discovery of novel small molecules that may exhibit therapeutic properties. The company is based in New York City and Cambridge, UK.

Depression Treatment: How Genetic Testing Can Help Find the Right Medication – Dunya News

17 October 2022, 08:42 am

ISLAMABAD, (Online) - That's according to a new study conducted by the U.S. Department of Veterans Affairs (VA) and published today in the Journal of the American Medical Association.

In it, researchers report that pharmacogenetic testing might help medical professionals by providing helpful information on how a person metabolizes a medication. This information can help doctors and others avoid prescribing antidepressants that could produce undesirable outcomes.

Depression medication is sometimes determined through trial and error to find the best drug and dosage. The researchers say they hope genetic testing can minimize this by giving insight into how a person may metabolize a drug.

Researchers said genetic testing did not show how a person would react to a particular medication but instead looked at how a person metabolized a drug. A drug-gene interaction is an association between a drug and a genetic variation that may impact a person's response to that drug. Learning more about drug-gene interactions could potentially provide information on whether to prescribe a medication and whether a dosage adjustment is needed.

In the study, around 2,000 people diagnosed with clinical depression at 22 VA medical centers received medications to treat their symptoms. The participants were randomized, with one half receiving usual care and one half undergoing pharmacogenetic testing.

For those who received usual care, doctors prescribed medication without the benefit of seeing a genetic testing result. The researchers found that 59 percent of the patients whose doctors received the genetic testing results used medications with no drug-gene interaction. Only 26 percent of the control group received drugs with no drug-gene interaction.
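
To make the arm-level comparison concrete, the quoted proportions imply roughly the following effect sizes; this is a back-of-the-envelope sketch using only the percentages reported above, not the study's own analysis:

    # Back-of-the-envelope effect sizes from the proportions quoted above
    # (59% vs 26% of patients receiving a drug with no predicted
    # drug-gene interaction). Not the study's statistical analysis.
    p_pgx = 0.59     # pharmacogenetic-testing arm
    p_usual = 0.26   # usual-care arm

    risk_difference = p_pgx - p_usual
    odds_ratio = (p_pgx / (1 - p_pgx)) / (p_usual / (1 - p_usual))

    print(f"Absolute difference: {risk_difference:.0%}")  # 33%
    print(f"Odds ratio: {odds_ratio:.1f}")                # ~4.1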

The researchers said the findings show that doctors avoided medications with a predicted drug-gene interaction.

"Most often, patients get tested after at least one or two drugs haven't worked or they had severe side effects," said Dr. David A. Merrill, a psychiatrist and director of the Pacific Neuroscience Institute's Pacific Brain Health Center at Providence Saint John's Health Center in California. "There are real genetically driven differences in how people metabolize drugs. It helps select more tolerable options to know about their genetics ahead of time."

Researchers interviewed participants about their depression symptoms at 12 weeks and 24 weeks.

Through 12 weeks, the participants who had genetic testing were more likely to have depression remission than those in the control group.

At 24 weeks, the outcome was not as pronounced. The researchers said this showed that genetic testing could relieve depressive symptoms faster than if a person did not receive the testing.

What experts think

There is a place for pharmacogenetic testing when treating people with depression, according to Dr. Alex Dimitriu, an expert in psychiatry and sleep medicine and founder of Menlo Park Psychiatry & Sleep Medicine in California and BrainfoodMD.

Some situations that might call for genetic testing include treatment-resistant depression and more complex cases.

"It tells me if someone will either rapidly or slowly metabolize a drug, meaning the level of the drug will either be too low or too high depending on the person's metabolism," Dimitriu told Healthline. "I have used it in a few rare cases to see what options remain."

"To me, more important than pharmacogenetic testing is watching the symptoms and response in my patients," he continued. "I see my patients often, especially when starting a new medicine, and we can go slow and watch how the patient is doing. If you start at a low dose and raise the dose slowly, with good monitoring and charting, you can readily see who responds too fast or too slow and at what dose."

Some doctors don't think the science is there yet and aren't going to rush into using pharmacogenetic testing based on this study.

"I used pharmacogenetic testing about ten years ago and the science is accurate. It tells you the person's genetic makeup," said Dr. Ernest Rasyida, a psychiatrist at Providence St. Joseph's Hospital.

"From a scientific point of view," he told Healthline, "this was a great study. It showed that the doctor used the data 60 percent of the time."

That means that the doctor looked at the data and the medications in the "green zone" and chose not to use them because of side effects or other reasons. Instead, they chose a drug in the "red zone" because of their clinical experience.

"I would argue that if 40 percent of the time you are going to use your judgment (and you should use your judgment) then why get the test?" he concluded.

In addition to depression, pharmacogenetic testing can also be used in the treatment of other non-mental health conditions, such as cancer and heart disease.

Experts say there is no risk to the patient when getting the test and the researchers said they believe it will likely benefit some patients substantially.

"Pharmacogenetic results are well-known and have been for years, but the clinical practice of medicine is very conservative, so it takes a long time for clearly beneficial changes to become common practice," Merrill told Healthline. "If 15 to 20 percent of patients started on a new drug can avoid a major gene-drug interaction by knowing their results, doing the test seems like a no-brainer to me."

A Decade of Breast Cancer at the Molecular Level: Pioneering Personalized Medicine – Targeted Oncology

Breast cancer treatment options have significantly expanded in the past decade, welcoming new classes of agents as well as treatments directed at specific patient populations.

Many believe that these advancements in breast cancer care over the past 10 years owe much to the increased understanding of molecular factors contributing to breast cancer pathogenesis and heterogeneity.1-3

In looking back at the past decade of targeted therapy in breast cancer, Targeted Therapies in Oncology (TTO) spoke with 2 medical oncologists with extensive expertise in breast cancer about how biomarker advancements have transformed the practice of breast cancer management.

"I think it's fair to say that breast cancer in particular has led the way in molecular therapeutics in oncology," Dennis J. Slamon, MD, director of clinical/translational research at the UCLA Jonsson Comprehensive Cancer Center, told TTO. "In part, that's because of all the investment that was made in research [and] because of defining this disease not just at a tissue level, but at a molecular level."

Classification of breast cancers into not just hormone receptor and HER2 positivity or negativity, but also into the luminal/basal subtypes has helped to identify treatments that may be more helpful for large groups of patients.1,3 For example, patients with basal-like disease, which is about 15% to 20% of all breast cancers, have triple-negative breast cancer (TNBC) and a poor prognosis. These patients tend to be responsive to chemotherapy treatment.1

The fact that molecular targets did not consistently translate to all breast cancers has become a key underpinning of our understanding of cancer.1 Not all patients benefit from molecularly targeted treatments. For instance, HER2-positive breast cancer only accounts for about 25% of all breast cancer cases, thus HER2-targeted therapies may only benefit 25% of all patients with breast cancer.

"The same story is coming up again and again, not necessarily the same genes or the same targets or the same pathways, but the fact that there is a diversity of these diseases that's far beyond what we used to use to classify cancers by the tissue in which they arose," Slamon said.

Our understanding of cancer as a potentially more complex disease than previously supposed began to develop well before 2012, explained Slamon.

"That started in breast cancer before molecular medicine, as far back as 1899 or '98, when a surgeon recognized the fact that this disease occurred in women and the fact that it may have some hormonal component," said Slamon.4 "After we found HER2, the methods of dissecting a tumor molecularly became much more sophisticated and widespread in their use and now, today, there are 14 molecular subtypes of breast cancer. And that is the underpinning of how breast cancer has led the way [in determining that patients with breast cancer] should not be treated with a one-size-fits-all approach. They should be treated with therapeutics that are directed to the appropriate subtype or the class in which they sit."

These molecular subtype characterizations have also shaped the therapeutic strategies within different breast cancer settings. "Just thinking about advances in targeted therapies and how we use them to treat breast cancer in the last decade, I separate it into 2 categories: 1 is how we treat localized breast cancer, when our goal is to cure the cancer, so stages I to III. Most patients are being diagnosed with those earlier stages of breast cancer," Marina Sharifi, MD, PhD, assistant professor and medical oncologist at the University of Wisconsin Carbone Cancer Center, told TTO.

"I think one of the major themes over the last 10 years for these nonmetastatic breast cancers is what I refer to as right-sizing therapy. We know that some of the women who have these early breast cancers can have recurrence down the road and we want to try and prevent that. So, 10 to 15 years ago, all of those women got chemotherapy, but even back then we knew that not every woman needs chemotherapy, and we knew that there were some breast cancers that could potentially benefit from more targeted types of therapies. But in the past 10 years, there have been a few developments that have allowed us to determine which women need chemotherapy and which women we can safely avoid exposing to the [adverse] effects of chemotherapy," said Sharifi.

This new prognostic ability has been fueled by advances in genomic testing.5,6 In addition to hormone receptors and molecular subtypes, other prognostic biomarkers that have been incorporated into practice include transcriptomic and proteomic levels and Ki-67 levels. Other biomarkers utilize combinations of genes to determine potential responses to treatment as well as the possibility of recurrence.

More recently, research has turned to the use of circulating DNA and circulating tumor cells to help identify further prognostic and predictive biomarkers for patients with breast cancer.6

"Specifically, for estrogen-driven (estrogen receptor [ER] positive) breast cancers, which are the most common type of breast cancer, we have genomic tests that are now used routinely to help us identify women who can safely avoid chemotherapy with that type of breast cancer. Both the MammaPrint and the Oncotype DX are genomic tests that we know are effective in identifying which women do need chemotherapy to help maximize their chances of cure and which women have lower-risk breast cancers where the chemotherapy actually won't help them because they don't need it.7,8 That has been a huge development in the field in the last 10 years: to go from knowing that these tests were out there but not having that confirmation that we know that they predict chemotherapy benefit, to having 2 major trials come out in the last 10 years that demonstrate that they can predict chemotherapy benefit, both in women who have those ER-positive breast cancers without lymph node involvement and also women who have ER-positive breast cancer with lymph node involvement. That has been a major advance for the most common type of breast cancer that's diagnosed across the country.9,10"

Both the TAILORx (NCT00310180) and RxPONDER (NCT01272037) trials validated the usefulness of the 21-gene Oncotype DX recurrence score assay in patients with hormone receptor-positive, HER2-negative breast cancer. The TAILORx trial showed that among patients with node-negative disease, those with an intermediate Oncotype DX score, or intermediate risk of recurrence, could benefit from treatment with endocrine therapy alone and avoid receiving chemotherapy. Younger patients (≤50 years) with a recurrence score of 16 to 25 still showed some benefit from the combination of chemotherapy and endocrine therapy.9 In RxPONDER, adjuvant chemotherapy use was not considered necessary in most postmenopausal women with node-positive disease and recurrence scores between 0 and 25. Alternatively, premenopausal women were more likely to benefit from adjuvant chemotherapy.10
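
To make the stratification reported by these two trials easier to follow, here is a simplified sketch of the decision logic as summarized above. It is an illustration of the published findings under simplifying assumptions, not a clinical decision tool, and the function name and structure are ours:

    # Simplified sketch of the chemotherapy-benefit findings summarized
    # above for HR-positive, HER2-negative disease. Illustrative only;
    # real decisions involve many more factors.

    def chemo_added_benefit(recurrence_score: int, node_positive: bool,
                            age: int, postmenopausal: bool) -> bool:
        """True if the trial results above suggest chemotherapy adds
        benefit over endocrine therapy alone."""
        if not node_positive:
            # TAILORx: intermediate scores did well on endocrine therapy
            # alone, except women <=50 with scores 16-25, who showed some
            # benefit from adding chemotherapy.
            if recurrence_score >= 26:
                return True  # high risk; chemotherapy assumed beneficial
            return age <= 50 and 16 <= recurrence_score <= 25
        # RxPONDER: node-positive, scores 0-25 -- no chemotherapy benefit
        # in most postmenopausal women; premenopausal women did benefit.
        if recurrence_score <= 25:
            return not postmenopausal
        return True

    # Example: 45-year-old premenopausal patient, node-negative, score 18.
    print(chemo_added_benefit(18, node_positive=False, age=45,
                              postmenopausal=False))  # True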

"On the flip side," said Sharifi, "some have high-risk TNBC or high-risk HER2-positive breast cancer; those are types of breast cancer where historically we have struggled to cure women. There we've had a number of different advances. In TNBC, we've had the introduction of immunotherapies into our treatment. The KEYNOTE-522 trial [NCT03036488] showed that if we combine pembrolizumab [Keytruda] with chemotherapy, that has significantly increased the number of women we're able to cure of that higher-risk TNBC."11

Approval of neoadjuvant pembrolizumab in combination with chemotherapy for patients with high-risk, early-stage TNBC, followed by single-agent adjuvant pembrolizumab, by the FDA in 2021 was a significant advancement for the treatment of patients with TNBC.12 Data from the KEYNOTE-522 trial were considered practice changing early on, showing a pathological complete response in 64.8% of patients treated with the regimen.11

"Likewise, for HER2-positive breast cancer, we have seen the development of multiple drugs that target HER2, from trastuzumab [Herceptin] and pertuzumab [Perjeta], to ado-trastuzumab emtansine [T-DM1; Kadcyla], that have increased the number of women we're able to cure of their HER2-positive breast cancers," Sharifi said.

Slamon also commented on the proliferation of HER2-targeting therapies in addition to the expansion of other types of targeted agents benefitting patients in the TNBC space. "[Since] our initial finding of HER2 and trastuzumab, now there's a ton of HER2 targeting: trastuzumab deruxtecan [Enhertu] and emtansine [Kadcyla], margetuximab [Margenza]; the list goes on and on of anti-HER2 therapeutics. Then there are new therapeutics for TNBC; they look at the TROP-2 target on tumor cells, and sacituzumab govitecan [Trodelvy] is the new therapeutic for that.13 As we identify new targets that we can approach with an antibody that'll attach to it, [it could be possible to] make an antibody-drug conjugate [ADC] to allow that antibody to go right to the target protein on the tumor cell and have it released internally, and that takes away the systemic effect of the chemotherapy and delivers it right into the cell. That's a whole new strategy that's coming into its own in a big way now," Slamon told TTO.

The phase 3 ASCENT study (NCT02574455) showed that sacituzumab produced a progression-free survival (PFS) and overall survival (OS) benefit over physician's choice of chemotherapy in patients with relapsed or refractory metastatic TNBC. The median PFS with sacituzumab was 5.6 months compared with 1.7 months with chemotherapy. Median OS was 12.1 months with the ADC and 6.7 months with chemotherapy.13

The emergence of these newer targeted therapies has permitted a risk-based tailoring of neoadjuvant and adjuvant therapies in the non-metastatic breast cancer space, observed Sharifi. "Another major development over the last 10 years, particularly for the [patients with] TNBC and HER2-positive breast cancers, is a shift towards neoadjuvant chemotherapy, which allows us to identify women with higher risk of recurrence after our standard pre-operative chemotherapy, and then add additional therapy after surgery to reduce their risk. For instance, that is how ado-trastuzumab emtansine is used in HER2-positive breast cancer, and there are other targeted options in this space, including olaparib [Lynparza] for women with germline BRCA mutations," she said.

"We have also made great strides in precision oncology in the metastatic breast cancer space, with an expansion of different types of targeted approaches, including mutation-targeted inhibitors, immunotherapy, and ADCs. While all of these developments have helped patients live longer and better with metastatic breast cancer, I think ADCs are the most game-changing new development for treating metastatic breast cancer," Sharifi told TTO. "As an example, the ADC trastuzumab deruxtecan is a HER2-targeting agent [encompassing] trastuzumab linked to a chemotherapy that was initially found to be extremely effective for HER2-positive metastatic breast cancer, even in women who have had multiple prior treatments with different other agents. Even more importantly, however, it has recently been shown to be effective also in women who have low HER2 expression, who would previously have been classified as HER2 negative.14 This has dramatically expanded the group of women with metastatic breast cancer who can benefit from trastuzumab deruxtecan to include what we are now calling HER2-low breast cancers, which are far more common than HER2-positive breast cancers. So that's been an important advance for us in the ADC space just in the last year," said Sharifi.

Data from the phase 3 DESTINY-Breast04 trial (NCT03734029) showed that patients with low HER2 expression can still benefit from HER2-targeted therapy. The trial demonstrated a median PFS of 10.1 months with trastuzumab deruxtecan vs 5.4 months with physician's choice of therapy in patients with HER2-low (IHC 1+/IHC 2+, ISH-) metastatic breast cancer who had received 1 to 2 prior lines of chemotherapy. The median OS was 23.9 months with trastuzumab deruxtecan and 17.5 months with physician's choice of chemotherapy.14 These findings led to the FDA approval of trastuzumab deruxtecan in this disease setting just this year.15

The Importance of Individualization

Turning to mutation-targeted therapies, "this has also been an active area in metastatic breast cancer treatment in the past 5 years, including the first FDA approval of a drug targeting PIK3CA mutations, [which] are common in many types of cancer and found in almost half of women who have ER-positive metastatic breast cancer, where the drug alpelisib [Piqray] has been approved for women with this type of mutation," Sharifi told TTO.

Approval for alpelisib in breast cancer was supported by findings from the phase 3 SOLAR-1 trial (NCT02437318), which showed that the PI3K inhibitor in combination with fulvestrant led to a median PFS of 11.0 months vs 5.7 months with fulvestrant alone in patients with PIK3CA-mutant, HR-positive, HER2-negative advanced breast cancer.16

"[Every] patient with metastatic breast cancer should be getting molecular profiling to identify possible targeted therapy options, and many patients will now have ADC treatment options that they may be eligible for at some point in their disease trajectory. For patients with localized breast cancer, I think we've also come a long way in being able to individualize therapy and avoid exposing patients to unnecessary [adverse] effects while also being able to augment treatment for patients who are at higher risk of recurrence and cure more women with this diagnosis," Sharifi said.

The basis of this personalized therapy derives from breast cancer-based research, observed Slamon. "The game-changer clearly was [the molecular advancements]. [When] looking at what is big in oncology, it's this appreciation that originated in breast cancer and now has spread throughout the field of human oncology about this molecular diversity defining a) different subtypes and b) new potential therapeutic targets or pathways," he said.

Sharifi looks to the continued development of ADCs as a cancer treatment modality. "There's a real untapped well of potential targets that we're just starting to explore in terms of developing new ADCs and combining them with targeted and immunotherapy approaches, and I think this will move the bar in how we're able to combat treatment resistance," said Sharifi.

Slamon's view of the future also comprises targeted strategies: "As we identify more targets...there'll probably be more and newer, perhaps even better, therapeutics than we have currently. Breast cancer has led this field."

REFERENCES:

1. Bettaieb A, Paul C, Plenchette S, Shan J, Chouchane L, Ghiringhelli F. Precision medicine in breast cancer: reality or utopia? J Transl Med. 2017;15(1):139. doi:10.1186/s12967-017-1239-z

2. Cocco S, Piezzo M, Calabrese A, et al. Biomarkers in triple-negative breast cancer: state-of-the-art and future perspectives. Int J Mol Sci. 2020;21(13):4579. doi:10.3390/ijms21134579

3. Low SK, Zembutsu H, Nakamura Y. Breast cancer: The translation of big genomic data to cancer precision medicine. Cancer Sci. 2018;109(3):497-506. doi:10.1111/cas.13463

4. Beatson GT. On the treatment of inoperable cases of carcinoma of the mamma: suggestions for a new method of treatment, with illustrative cases. Trans Med Chir Soc Edinb. 1896;15:153-179.

5. Hou Y, Peng Y, Li Z. Update on prognostic and predictive biomarkers of breast cancer. Semin Diagn Pathol. 2022;39(5):322-332. doi:10.1053/j.semdp.2022.06.015

6. Nicolini A, Ferrari P, Duffy MJ. Prognostic and predictive biomarkers in breast cancer: past, present and future. Semin Cancer Biol. 2018;52(Pt 1):56-73. doi:10.1016/j.semcancer.2017.08.010

7. Cardoso F, van't Veer LJ, Bogaerts J, et al; MINDACT Investigators. 70-gene signature as an aid to treatment decisions in early-stage breast cancer. N Engl J Med. 2016;375(8):717-729. doi:10.1056/NEJMoa1602253

Unspinning the secrets of spider webs – Australian Geographic

By Esme Mathis, October 19, 2022

Stronger than steel and more elastic than rubber, spider silk has the potential to transform medicine, engineering, and materials science, if only we can learn how to produce it.

A new global study, involving University of New South Wales scientists, has analysed the silk properties of spiders across Oceania, Asia, Europe and the USA to better understand how this natural wonder can be emulated in future biomaterials. The research, published in Science Advances, catalogued the silk gene sequences of 1098 species from 76 families.

"Up until now, there was a pretty good literature set of how spider silk performs," says Dr Sean Blamires, an evolutionary ecological biologist from UNSW Sydney's School of Biological, Environmental and Earth Sciences. "But what has been lacking is a way to generalise across spiders and find out what causes specific properties. Is there a link between genes, protein structures and fibres?"

According to Sean, the large data set collected over five years allows scientists to create complex models, using machine learning to understand how and why specific silk properties vary between species, and even between individual spiders.

"Just like the Human Genome Project has given researchers the ability to identify specific gene sequence mutations that cause specific diseases, this database gives biologists and material scientists the ability to derive direct genetic causes for the properties of spider silk," he says.
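
As a loose illustration of the genotype-to-property modelling described here (the study's actual features and models are not reproduced), one might regress a measured silk property on sequence-derived features. Everything below, from the feature names to the numbers, is hypothetical:

    # Hypothetical sketch: predict a silk property from sequence-derived
    # features, in the spirit of the database described above. The
    # features and data are invented; the real study's models differ.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(1)
    n_species = 200

    # Toy features one might extract from spidroin gene sequences:
    # poly-alanine motif count, GPGXX motif count, repeat-region length.
    X = np.column_stack([
        rng.poisson(20, n_species),
        rng.poisson(35, n_species),
        rng.normal(3000, 400, n_species),
    ])
    # Toy target: silk toughness, loosely tied to the first two features.
    y = 50 + 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 10, n_species)

    model = RandomForestRegressor(n_estimators=200, random_state=0)
    scores = cross_val_score(model, X, y, cv=5, scoring="r2")
    print(f"Cross-validated R^2: {scores.mean():.2f}")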

There are seven types of spider silk, secreted from different glands within a spider. Of these, dragline silk is the crowning glory. Known for its strength, durability and flexibility, dragline silk has captured scientists' imagination for decades with its tantalising potential.

"In a spiderweb, the dragline silk makes up the framework and the radials. It's also the silk that the spider uses when it drops off a web," says Sean. "Non-web-building spiders might use it to make retreats or use it for signalling with each other, while trapdoor spiders use something very similar."

In Australia, the dragline silk produced by orb-weaving spiders is so tough that it outperforms Kevlar and steel. It's tough, but also flexible.

"Most materials are either one or the other," says Sean.

The study measured the mechanical, thermal, structural and hydration properties of dragline silks.

It's hoped this research will provide a blueprint for renewable, biodegradable and sustainable biopolymers. Suggested uses for this lightweight material range from bulletproof vests, flexible building materials and biodegradable bottles to a non-toxic biomaterial in regenerative medicine that can be used as a scaffold to grow and repair damaged nerves or tissues.

A sound approach for effective gene therapy delivery to brain – The Source – Washington University in St. Louis

Researchers have been experimenting with different ways to deliver genes to the brain to treat central nervous system diseases and tumors. One of the obstacles, however, is the ability to penetrate the blood-brain barrier while having minimal effect on the other organs in the body.

Hong Chen, associate professor of biomedical engineering at the McKelvey School of Engineering and of radiation oncology at the School of Medicine, both at Washington University in St. Louis, and her team found an effective method to overcome that obstacle using focused ultrasound intranasal delivery (FUSIN). In new research, they found that the intranasally delivered gene therapy had comparable or better outcomes than existing methods while having minimal effect on the body's other organs.

Results of the research, led by Chen and Dezhuang Ye, a postdoctoral research associate, and collaborators, were published online in the journal eBioMedicine on Sept. 21. It is the first study to evaluate the potential of FUSIN to deliver adeno-associated viral vectors, small viruses used to deliver gene therapy, in a mouse model.

Read more on the engineering website.

Visit link:
A sound approach for effective gene therapy delivery to brain - The Source - Washington University in St. Louis

NIST Cloud Computing Program – NCCP | NIST

Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction. This cloud model promotes availability and is composed of five essential characteristics (On-demand self-service, Broad network access, Resource pooling, Rapid elasticity, Measured Service); three service models (Cloud Software as a Service (SaaS), Cloud Platform as a Service (PaaS), Cloud Infrastructure as a Service (IaaS)); and four deployment models (Private cloud, Community cloud, Public cloud, Hybrid cloud). Key enabling technologies include: (1) fast wide-area networks, (2) powerful, inexpensive server computers, and (3) high-performance virtualization for commodity hardware.
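
For readers who like to see structure as code, here is a minimal sketch (ours, not NIST's) that encodes the taxonomy above as plain Python enums; the names simply mirror the definition.

```python
# Illustrative only: the NIST cloud model's taxonomy as Python enums.
from enum import Enum

class EssentialCharacteristic(Enum):
    ON_DEMAND_SELF_SERVICE = "On-demand self-service"
    BROAD_NETWORK_ACCESS = "Broad network access"
    RESOURCE_POOLING = "Resource pooling"
    RAPID_ELASTICITY = "Rapid elasticity"
    MEASURED_SERVICE = "Measured Service"

class ServiceModel(Enum):
    SAAS = "Cloud Software as a Service"
    PAAS = "Cloud Platform as a Service"
    IAAS = "Cloud Infrastructure as a Service"

class DeploymentModel(Enum):
    PRIVATE = "Private cloud"
    COMMUNITY = "Community cloud"
    PUBLIC = "Public cloud"
    HYBRID = "Hybrid cloud"

# Five characteristics, three service models, four deployment models.
assert (len(EssentialCharacteristic), len(ServiceModel), len(DeploymentModel)) == (5, 3, 4)
```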

The Cloud Computing model offers the promise of massive cost savings combined with increased IT agility. It is considered critical that government and industry begin adoption of this technology in response to difficult economic constraints. However, cloud computing technology challenges many traditional approaches to datacenter and enterprise application design and management. Cloud computing is currently being used; however, security, interoperability, and portability are cited as major barriers to broader adoption.

The long term goal is to provide thought leadership and guidance around the cloud computing paradigm to catalyze its use within industry and government. NIST aims to shorten the adoption cycle, which will enable near-term cost savings and increased ability to quickly create and deploy enterprise applications. NIST aims to foster cloud computing systems and practices that support interoperability, portability, and security requirements that are appropriate and achievable for important usage scenarios.

Read the original here:

NIST Cloud Computing Program - NCCP | NIST

Alibaba Invests in Cloud Computing Business With New Campus – ETF Trends

Chinese tech giant Alibaba Group Holding Ltd. has opened a new campus for its cloud computing unit, Alibaba Cloud, in its home city of Hangzhou. Per the South China Morning Post, which Alibaba owns, the 10-building, 2.1 million-square-foot campus is roughly the size of the 2 million-square-foot campus for Google's Silicon Valley headquarters, aka the Googleplex, in Mountain View, California.

Alibaba Cloud also highlighted the campus's eco-friendly designs in a video, including a photovoltaic power generation system, flowerpots made from recycled plastic, and high-efficiency, low-energy devices in the on-site coffee shop, according to SCMP.

The new campus signals the firm's commitment to investing in its growing cloud computing business. While Alibaba's net income dropped 50% year-over-year in the second quarter to 22.74 billion yuan ($3.4 billion), Alibaba Cloud experienced the fastest growth among all of Alibaba's business segments in Q2, making up 9% of total revenue.

The new facilities also come at a time when China's economy has been facing a slowdown. While China's economy is slowing down, Alibaba's cloud computing unit has been eyeing expansion opportunities overseas. For example, Alibaba Cloud announced last month a $1 billion commitment to upgrading its global partner ecosystem.

Alibaba is currently the third-largest holding in EMQQ Global's flagship exchange-traded fund, the Emerging Markets Internet & Ecommerce ETF (NYSEArca: EMQQ), with a weighting of 7.01% as of October 14. EMQQ seeks to offer investors exposure to the growth in internet and e-commerce activities in the developing world as middle classes expand and affordable smartphones provide unprecedentedly large swaths of the population with access to the internet for the first time, according to the issuer.

EMQQ tracks an index of leading internet and e-commerce companies that includes online retail, search engines, social networking, online video, e-payments, online gaming, and online travel.

For more news, information, and strategy, visit our Emerging Markets Channel.

Read more from the original source:

Alibaba Invests in Cloud Computing Business With New Campus - ETF Trends

The Top 5 Cloud Computing Trends In 2023 – Forbes

The ongoing mass adoption of cloud computing has been a key driver of many of the most transformative tech trends, including artificial intelligence (AI), the internet of things (IoT), and remote and hybrid working. Going forward, we can expect to see it becoming an enabler of even more technologies, including virtual and augmented reality (VR/AR), the metaverse, cloud gaming, and even quantum computing.

Cloud computing makes this possible by removing the need to invest in buying and owning the expensive infrastructure required for these intensive computing applications. Instead, cloud service providers make it available "as-a-service," running on their own servers and data centers. It also means companies can, to some extent, avoid the hassle of hiring or training a highly specialized workforce if they want to take advantage of these breakthrough technologies.

In 2023 we can expect to see companies continuing to leverage cloud services in order to access new and innovative technologies as well as drive efficiencies in their own operations and processes. Here's a rundown of some of the trends that I believe will have the most impact.

Increased investment in cloud security and resilience

Migrating to the cloud brings huge opportunities, efficiencies, and convenience but also exposes companies and organizations to a new range of cybersecurity threats. On top of this, the growing pile of legislation around how businesses can store and use personal data means that the risk of fines or (even worse) losing the trust of their customers is a real problem.

As a result, spending on cyber security and building resilience against everything from data loss to the impact of a pandemic on global business will become even more of a priority during the coming year. However, as many companies look to cut costs in the face of a forecasted economic recession, the emphasis is likely to be on the search for innovative and cost-efficient ways of maintaining cyber security in order to get the most "bang for the buck." This will mean greater use of AI and predictive technology designed to spot threats before they cause problems, as well as an increase in the use of managed security-as-a-service providers in 2023.

Multi-cloud is an increasingly popular strategy

If 2022 was the year of hybrid cloud, then 2023 could be the year that businesses come to understand the advantages of diversifying their services across a number of cloud providers. This is a strategy known as taking a multi-cloud approach, and it offers a number of advantages, including improved flexibility and security.

It also prevents organizations from becoming too tied in to one particular ecosystem - a situation that can create challenges when cloud service providers change the applications they support or stop supporting particular applications altogether. And it helps to create redundancy that reduces the chance of system errors or downtime from causing a critical failure of business operations.

Adopting a multi-cloud infrastructure means moving away from potentially damaging business strategies such as building applications and processes solely around one particular cloud platform, e.g., AWS, Google Cloud, or Microsoft Azure. The growing popularity of containerized applications means that in the event of changes to service levels, or more cost-efficient solutions becoming available from different providers, applications can be quickly ported across to new platforms. While back in 2020, most companies (70 percent) said they were still tied to one cloud service provider, reports have found that 84% of mid-to-large companies will have adopted a multi-cloud strategy by 2023, positioning it as one of the year's defining trends in cloud computing.
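
To make the lock-in point concrete, here is a minimal, hypothetical sketch of the idea in Python: application code targets a neutral storage interface, and each provider gets a thin adapter. All class and method names are invented for illustration; real projects often reach for libraries such as Apache Libcloud, or put provider SDKs behind a similar seam.

```python
# Hypothetical provider-neutral storage seam; all names invented for
# illustration. Swapping clouds means swapping the adapter, not the app.
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Stand-in backend; an AWS/GCP/Azure adapter would expose the same API."""

    def __init__(self) -> None:
        self._blobs: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._blobs[key] = data

    def get(self, key: str) -> bytes:
        return self._blobs[key]

def archive_report(store: ObjectStore, name: str, body: bytes) -> None:
    # Application code depends only on the interface, so changing providers
    # does not mean rewriting business logic.
    store.put(f"reports/{name}", body)

archive_report(InMemoryStore(), "q3.pdf", b"%PDF-...")
```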

The AI and ML-powered cloud

Artificial intelligence (AI) and machine learning (ML) are provided as cloud services because few businesses have the resources to build their own AI infrastructure. Gathering data and training algorithms require huge amounts of computing power and storage space that is generally more cost-efficient to rent as-a-service. Cloud service providers are increasingly relying on AI themselves for a number of tasks. This includes managing the vast, distributed networks needed to provide storage resources to their customers, regulating the power and cooling systems in data centers, and powering cyber security solutions that keep their data safe. In 2023, we can expect to see continued innovation in this field as hyperscale cloud service providers like Amazon, Google, and Microsoft continue to apply their own AI technology to create more efficient and cost-effective cloud services for their customers.

Low-code and no-code cloud services

Tools and platforms that allow anybody to create applications and to use data to solve problems without getting their hands dirty writing computer code are increasingly popular. This category of low-code and no-code solutions includes tools for building websites, web applications, and just about any kind of digital solution a company may need. Low-code and no-code solutions are even becoming available for creating AI-powered applications, drastically lowering the barriers to entry for companies wanting to leverage AI and ML. Many of these services are provided via the cloud, meaning users can access them as-a-service without having to own the powerful computing infrastructure needed to run them. Tools like Figma, Airtable, and Zoho allow users to carry out tasks that previously would have required coding experience, such as designing websites, automating spreadsheet tasks, and building web applications, and I see providing services like this as an area where cloud technology will become increasingly useful in 2023 and beyond.

Innovation and consolidation in cloud gaming

The cloud has brought us streaming services like Netflix and Spotify, which have revolutionized the way we consume movies, TV, and music. Streaming video gaming is taking a little longer to gain a foothold but is clearly on its way, with Microsoft, Sony, Nvidia, and Amazon all offering services in this field. It hasn't all been plain sailing, however: Google spent millions of dollars developing its Stadia streaming gaming service only to retire it this year due to lack of commercial success. One of the problems is the networks themselves: streaming video games clearly requires higher bandwidth than music or video, meaning it's restricted to those of us with good-quality high-speed internet access, which is still far from all of us. However, the ongoing rollout of 5G and other ultra-fast networking technologies should eventually solve this problem, and 2023 could be the year that cloud gaming makes an impact. Google has said that the technology that powers Stadia will live on as the backbone of an in-development B2B game streaming service that will allow game developers to provide streaming functionality directly to their customers. If, as many predict, cloud gaming becomes the killer app for 5G in the same way that streaming video was for 4G and streaming music was for 3G, then 2023 could be the year when we start to see things fall into place.

To stay on top of the latest business and tech trends, make sure to subscribe to my newsletter, follow me on Twitter, LinkedIn, and YouTube, and check out my books Tech Trends in Practice and Business Trends in Practice, which just won the 2022 Business Book of the Year award.

Visit link:

The Top 5 Cloud Computing Trends In 2023 - Forbes

Benefits of Cloud Computing That Can Help You With Your Business 2023 – ReadWrite

Today, businesses are looking to operate more flexibly and cost-effectively. This has led to the rise of cloud computing as a viable solution for almost every business. Cloud computing uses a network of remote servers hosted on the Internet and accessible through standard web browsers or mobile apps.

It enables users to store data remotely, exchange files, and access software applications from anywhere with an internet connection. In addition, individuals and businesses that use the cloud can access their data from any computer or device connected to the internet, allowing them to sync their settings and files wherever they go.

There are many advantages to using cloud services for your business. Here are 10 benefits of cloud computing that can help you with your business.

One of the most significant benefits of cloud computing is its security. If you run your business on the cloud, you don't have to worry about protecting your data from hackers or other threats.

Cloud providers use industry-standard security practices to keep your data safe, including firewalls, encryption, and authentication systems.

You can further customize your business's security settings if your business uses a private cloud. For example, if an employee loses or misplaces a device that has access to your data, you can remotely disable that device without putting your data at risk.

You can also encrypt your data to protect it against cyber threats. Businesses can also use multi-factor authentication (MFA) to protect their data further. MFA requires users to input a one-time passcode sent to their phone to log in and confirm their identity.
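
For the curious, the one-time-passcode step described above can be sketched in a few lines with the open-source pyotp library (pip install pyotp); the secret here is generated inline purely for illustration, whereas a real service would provision and store one per user.

```python
# Illustrative TOTP flow with pyotp; a real service stores one secret per user.
import pyotp

secret = pyotp.random_base32()   # shared secret enrolled on the user's device
totp = pyotp.TOTP(secret)        # time-based code, 30-second window by default

code = totp.now()                # what the user's authenticator app displays
assert totp.verify(code)         # what the server checks at login
```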

Another advantage of cloud computing is its scalability. Cloud providers offer scalable cloud solutions that you can adjust to meet your business's needs.

You can scale up or down your system on demand to deal with seasonal traffic or unexpected spikes in usage. This allows you to avoid buying too much computing power and resources upfront and allows your business to adjust to changes in demand quickly.

You can also try out a cloud solution before you commit to it by renting a smaller instance for a trial period. Cloud solutions are also flexible enough for you to upgrade or downgrade your solutions as your business scales up or down.

This means that you don't have to buy more computing power than you need upfront, and you don't have to upgrade your systems again if your business starts to slow down.
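
A toy illustration of that scale-up/scale-down logic follows, assuming arbitrary thresholds; real platforms such as AWS Auto Scaling or the Kubernetes Horizontal Pod Autoscaler apply the same idea with richer metrics.

```python
# Toy threshold-based autoscaling rule; thresholds are arbitrary assumptions.
def desired_instances(current: int, cpu_utilisation: float,
                      low: float = 0.30, high: float = 0.75,
                      minimum: int = 1, maximum: int = 20) -> int:
    if cpu_utilisation > high:     # demand spike: add capacity
        return min(current + 1, maximum)
    if cpu_utilisation < low:      # quiet period: shed capacity (and cost)
        return max(current - 1, minimum)
    return current                 # within band: hold steady

assert desired_instances(4, 0.90) == 5   # seasonal traffic spike
assert desired_instances(4, 0.10) == 3   # scaling back down afterwards
```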

Cloud computing can help you achieve greater flexibility and mobility if your business relies on people working remotely. With cloud solutions, you can access your data and run your applications from any computer or device connected to the internet.

When you can access all your data from anywhere, employees can work from home, in coffee shops, or other locations without sacrificing productivity. In addition, cloud providers offer a wide range of collaboration and communication tools that work with their services.

You can also use these tools to collaborate and communicate with clients and vendors who don't need access to your company's data.

Another advantage of cloud computing is its consistency. While different people and departments may use different devices and software, cloud solutions ensure everyone has a consistent experience.

This prevents miscommunications and ensures that everyone is on the same page. Whether you use Office 365, Google G Suite, Salesforce, or another cloud service, your business will have a consistent experience across platforms.

You can also use tools, like identity integration, to access information from different applications without switching between them.

Cloud solutions offer significant cost reductions over the long run compared to other IT solutions. You can save money on hardware, upgrades, and software licenses while enjoying a flexible and scalable solution.

Cloud providers handle all the maintenance and upgrade of their systems, so you dont have to worry about keeping up with the latest trends in IT.

If your business uses various cloud services, you can easily integrate them to streamline your workflows.

Many cloud services have a wide range of integrations with other services that you can use to enhance your business processes. For example, you can use Salesforce to manage your leads and close rates and Zapier to link it with other business tools like Gmail, Mailchimp, and Google Calendar.

You can also use a hybrid cloud solution that lets you keep your data close to home while accessing additional IT services through the cloud.

Cloud solutions offer effectively unlimited storage, unlike on-premise data storage solutions. You can scale down your cloud solution if you don't need as much storage for your data, and increase your storage again later.

You can also use a hybrid solution to keep some of your data local while storing other data in the cloud.

Another advantage of cloud computing is faster performance. If you use the cloud, you aren't limited by your own hardware, and your systems are more scalable.

This means that your website and other business applications will perform faster without you having to make hardware upgrades.

You can also use a hybrid solution to improve your performance by keeping your most critical data close to home while accessing other data in the cloud.

Cloud solutions offer a collaborative online environment that lets you share important information with clients and vendors. You can use collaboration tools like wikis, blogs, and forums to work with team members and manage your projects.

You can also use collaboration tools to communicate with clients and vendors who dont need access to your companys data. These tools let you share documents, collaborate on tasks, and manage your workflow from a single platform.

Even though control is an essential aspect of a company's success, some things will always be outside your control, no matter how tightly your organization manages its own procedures. In today's market, even a small amount of downtime has a significant impact.

Business downtime leads to lost productivity, revenue, and reputation. Although you can't prevent or foresee every catastrophe, there is something you can do to speed up your recovery. Cloud-based data recovery services provide quick recovery in emergency situations, such as natural disasters and electrical outages.

The range of benefits of cloud computing makes it a viable solution for almost every business. It offers many advantages that can help you streamline your workflow, achieve better performance, and operate more efficiently.

Suvigya Saxena is the Founder & CEO of Exato Software, a globally ranked mobile, cloud computing and web app development company. With 15+ years of experience in IT, he is known for delivering out-of-the-box solutions across the domain.

Go here to see the original:

Benefits of Cloud Computing That Can Help You With Your Business 2023 - ReadWrite

cloud-computing GitHub Topics GitHub

Here are 1,464 public repositories matching this topic...

Learn and understand Docker&Container technologies, with real DevOps practice!

A curated list of software and architecture related design patterns.

Pulumi - Universal Infrastructure as Code. Your Cloud, Your Language, Your Way

High-Performance server for NATS.io, the cloud and edge native messaging system.

A curated list of Microservice Architecture related principles and technologies.

Cloud Native application framework for .NET

A curated list of awesome services, solutions and resources for serverless / nobackend applications.

Cloud Native Control Planes

A comprehensive tutorial on getting started with Docker!

Rules engine for cloud security, cost optimization, and governance, DSL in yaml for policies to query, filter, and take actions on resources

Service Fabric is a distributed systems platform for packaging, deploying, and managing stateless and stateful distributed applications and containers at large scale.

Open, Multi-Cloud, Multi-Cluster Kubernetes Orchestration

The open-source data integration platform for security and infrastructure teams

Web-based Cloud Gaming service for Retro Game

A list of resources in different fields of Computer Science

Example of a cinema microservice

This repository consists of the code samples, assignments, and notes for the DevOps bootcamp of Community Classroom.

Awesome Cloud Security Resources

Source code accompanying book: Data Science on the Google Cloud Platform, Valliappa Lakshmanan, O'Reilly 2017

TFHE: Fast Fully Homomorphic Encryption Library over the Torus

Original post:

cloud-computing GitHub Topics GitHub

Cloud Computing: A catalyst for the IoT Industry – SiliconIndia

Cloud computing is a great enabler for today's businesses for a variety of reasons. It helps companies, particularly small and medium enterprises, jumpstart their operations sooner, as there is very little lead time needed to stand up a full-fledged in-house IT infrastructure. Secondly, it eases the financial requirements by avoiding heavy capex and turning IT costs into an opex model. Even more advantageous, the opex costs can be scaled up and down dynamically based on demand, thus optimizing IT costs.

I think Cloud computing became a catalyst for the IoT industry, and the proliferation that is seen today probably would not have happened in the absence of Cloud integration. Typically, IoT devices like sensors generate huge amounts of data that require both storage and processing, making Cloud platforms the perfect choice for building IoT-based solutions. In an IoT implementation, apart from data assimilation there are some fundamental aspects like security and managing devices that need to be considered, and Cloud platforms take over some of these implementation aspects, enabling the solution provider to focus on the core problem.
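
As a hedged illustration of that sensor-to-cloud flow, the sketch below publishes a single reading to an MQTT broker using the open-source paho-mqtt client (pip install paho-mqtt); the broker hostname and topic are placeholders, not a real service.

```python
# Illustrative sensor-to-cloud publish; broker and topic are placeholders.
import json
import paho.mqtt.publish as publish

reading = {"device_id": "sensor-042", "temperature_c": 21.7}

publish.single(
    topic="factory/line1/temperature",   # hypothetical topic hierarchy
    payload=json.dumps(reading),
    hostname="broker.example.com",       # placeholder cloud MQTT endpoint
    port=1883,
)
```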

An interesting case study of how IoT and Cloud technologies can help to create innovative solutions was presented at a Microsoft conference a few years back. It's a solution developed to monitor pollution levels in the Ganges, a project sponsored by the Central Pollution Control Board. For more information, readers can go to this link: https://azure.microsoft.com/en-us/blog/cleaning-up-the-ganges-river-with-help-from-iot/

Digital technology in the financial services

When we talk about disruptive digital technologies in the Financial Services industry, perhaps Blockchain is the one that stands out immediately. The concept of DLT (Distributed Ledger Technology) has been around for some time, and there's lots of interest in leveraging this technology primarily for transparency and efficiency reasons. After an article by the Reserve Bank of India in 2020, many Indian banks responded to this initiative by starting to look at opportunities that involve DLT. For example, State Bank of India tied up with JP Morgan to use their Blockchain technology.

Adoption of Blockchain could simplify Inter-bank payment settlement and perhaps could be extended in future to cross-border payment settlements across different DLT platforms. It could also be used for settlement of securitized assets by putting them on a common ledger. Another application is using DLT for KYC whereby multiple agencies (like banks) can access customer data from a decentralized and secure database. In fact, EQ uses Blockchain in its product offering to privately funded companies and PEs for Cap table management.

The next one is probably Artificial Intelligence (AI) and Machine Learning (ML), which is predominantly being applied in the Financial Services industry to manage internal and external risks. AI-based algorithms now underpin risk-based pricing in the Insurance sector and help reduce NPAs in the Banking sector. The technology helps banks predict defaults and take proactive measures to mitigate that risk.

In the Indian context, Unified Payments Interface (UPI) and Aadhar-enabled Payment Service (AePS) are classic examples of disruptive products in financial services industry.

Effective Network Security acts as a gatekeeper

In today's connected world, where much of commerce happens online, it's imperative that businesses focus on security to safeguard themselves from threats in cyberspace. The recent approach to network security is the Zero Trust model, which basically means never trusting any user/device unless verified. In this model, mutual authentication happens between the two entities in multiple ways: for example, using user credentials followed by a second factor like an OTP, while application authentication sometimes happens through a digital certificate. The process also uses analytics and log analysis to detect abnormalities in user behaviour and enforce additional authentication measures while sending alerts at the same time. This is something many of us might have come across when we try to connect to an application from a new device that the application is not aware of: the security mechanism might enforce additional authentication whilst sending an alert to us. Nowadays, businesses also use innovative methods of authentication like biometrics and voice recognition, some of which are powered by AI/ML.
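
The "new device" scenario described above boils down to a simple step-up decision, sketched below with entirely hypothetical data structures and names:

```python
# Hypothetical step-up authentication rule: an unrecognised device is never
# trusted outright, even when the password is correct.
KNOWN_DEVICES = {"alice": {"laptop-7f3a"}}   # devices previously verified per user

def login_decision(user: str, device_id: str, password_ok: bool) -> str:
    if not password_ok:
        return "deny"
    if device_id not in KNOWN_DEVICES.get(user, set()):
        # Unfamiliar device: demand a second factor and alert the user.
        return "require_second_factor_and_alert"
    return "allow"

assert login_decision("alice", "laptop-7f3a", True) == "allow"
assert login_decision("alice", "phone-new", True) == "require_second_factor_and_alert"
```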

Fintech players leverage Artificial Intelligence to bridge the gap in MSME lending

I think MSME lending (maybe Retail Lending too) is one of the segments significantly disrupted by technology. In a way, it has opened unconventional options for MSMEs to attract capital both for capex and working capital requirements. There are products ranging from P2P lending to Invoice Discounting offered by Fintech companies, which is opening up a new marketplace. There are Fintech players interested in lending in this space, and they use AI/ML models to predict the probability of default, assess credit risk and appropriately hedge against it.

See the original post here:

Cloud Computing: A catalyst for the IoT Industry - SiliconIndia

Revitalising data and infrastructure management through cloud – ETCIO South East Asia

The Cloud has been a significant contributor to the digital optimisation and transformation of businesses and institutions globally since the 2010s. It seems almost an eternity ago, when the IT department was essentially a support function, with KRAs around design and delivery of Information Technology Architecture encompassing Infrastructure, Data Centres and constituent servers, personal computers, software, networking and security systems, along with the associated, vendor evaluation, outsourcing, contracting, commissioning and finally aligning with business systems and goals, as this pre-millennium Research Gate paper indicates.

The one and a half decades since the advent of the millennium saw the rise of many trends besides the cloud, such as the shift from integrated to business-specific applications and the resulting data management and insights, globalisation, adoption of Infrastructure as a Service (IaaS) and Platform as a Service (PaaS), the explosion of telecom, mobility and Mobile Backend as a Service (MBaaS), other technologies such as social media, E-commerce, Extended Reality, Digital Twins, AI/ML, RPA, Internet of Things, Blockchain and Chatbots and, lastly, the growing skill-gaps and demands of talent.

The cloud has now taken over a major chunk of responsibilities pertaining to infrastructure, data centre, hosting, SaaS and architectural applications, platforms, networking and security functions, thus freeing up the IT and Business Teams to leverage technology for more strategic tasks related to operations, customers, R&D, supply chain and others. The cloud hence enabled companies and institutions to leverage the amalgamation of technology, people and processes across their extended enterprises to have ongoing digital programmes for driving revenue, customer satisfaction, and profitability. The cloud can potentially add USD 1 Trillion of economic value across the Fortune 500 band of companies by 2030, as this research by McKinsey estimates.

Even before the pandemic, although the initial adoption of Cloud was in SaaS applications, IaaS and PaaS were surely catching up, thus shifting away from the traditional Data Centre and On-premise Infrastructure. Gartner's research way back in 2015 predicted a 30%-plus increase in IaaS spending, with public cloud IaaS workloads finally surpassing those of on-premise loads. In the same year, a similar Gartner paper highlighted significant growth in PaaS as well: both for Infrastructure and Application iPaaS.

The Cloud is adding significant value across industry verticals and business functions, right from remote working with online meetings & collaboration tools, automated factory operations, extended reality, digital twins, remote field services and many others. The cloud has also been adopted as the platform for deploying other new technologies such as RPA and Artificial Intelligence/Machine Learning (AI/ML). Depending on industry best practices, business use cases and IT strategies, it became feasible to leverage infrastructure, assets, applications, and software in a true Hybrid/Multi/Industry Cloud scenario with separate private and public cloud environments covering IaaS, PaaS, SaaS and MBaaS. As platforms were maturing, organisations were furthermore transitioning from Virtual Machine and even IaaS-based solutions to PaaS-based ones. Gartner had predicted in this research that by 2021, over 75% of enterprises and mid-sized organisations would adopt a hybrid or multi-cloud strategy.

There was also a clear transition from the traditional lift-and-shift to the cloud native approach, which makes full use of cloud elasticity and optimisation levers and moreover minimises technical debt and inefficiencies. This approach makes use of cloud computing to build and run microservices-based scalable applications running in virtualised containers, orchestrated through Container-as-a-Service platforms and managed and deployed using DevOps workflows. Microservices, container management, infrastructure as code, serverless architectures, declarative code and continuous integration and delivery (CI/CD) are the fundamental tenets of this cloud native approach. Organisations are balancing the use of containerisation along with leveraging cloud hosting provider capabilities, especially considering the extent of hybrid cloud and the efforts and costs of container infrastructure and running commodity applications.
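
As a small illustration of the infrastructure-as-code tenet mentioned here, the sketch below is written in the style of Pulumi's Python SDK (Pulumi also appears in the GitHub topic list earlier in this digest); running it would need the pulumi and pulumi_aws packages plus configured AWS credentials, so treat it as illustrative only.

```python
# Desired state as code, in the style of Pulumi's Python SDK: declare a
# resource and let the engine reconcile real infrastructure to match.
import pulumi
import pulumi_aws as aws

# A single storage bucket; "app-assets" and the tag are example values.
bucket = aws.s3.Bucket("app-assets", tags={"env": "dev"})

pulumi.export("bucket_name", bucket.id)
```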

From the architecture standpoint, cloud-based composable architectures such as MACH (Microservices-based, API-first, Cloud-native SaaS and Headless) and Packaged Business Capabilities (PBCs) are increasingly being used in organisations for enhancing Digital Experience Platforms, enabling customers, employees and the supply chain with the new-age omnichannel experience. These architectures facilitate faster deployment and time to market through quick testing by sample populations and subsequent full-fledged implementations. These composable architectures help organisations future-proof their IT investments and improve business resilience and recovery with the ability to decouple and recouple the technology stacks. At the end of the 1st year of the pandemic, Gartner here highlighted the importance of composable architecture in its Hype Cycle of 2021, especially in business resilience and recovery during a crisis.

Intelligently deploying Serverless Computing in the architecture also enhances cloud native strategies immensely, enabling developers to focus on triggers and running function/event-based computing, and resulting in more optimised cloud economics. Also, access to the cloud service providers' Function-as-a-Service (FaaS) and Backend-as-a-Service (BaaS) models significantly reduces IT environment transformation costs. This Deloitte Research illustrates the advantages that Serverless computing can bring to retail operations.
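
For readers unfamiliar with FaaS, a function there is just a handler the platform invokes per event; the sketch below follows the common AWS Lambda Python convention, with an assumed event shape.

```python
# Minimal FaaS-style handler: the platform supplies event and context per
# invocation; there is no server for the developer to provision or scale.
import json

def handler(event, context):
    name = event.get("name", "world")   # assumed event shape
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello, {name}"}),
    }

# Local smoke test; in the cloud, the FaaS runtime calls handler() itself.
if __name__ == "__main__":
    print(handler({"name": "cloud"}, None))
```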

To enhance their cloud native strategies, further encourage citizen development, reduce over-reliance on IT and bridge the IT-Business gap, organisations are also making use of Low Code No Code (LCNC) tools and assembly, clearly shifting away from traditional application development. Citizen developers are making use of LCNC functionalities such as drag and drop, pre-built user interfaces, APIs and connectors, one-click delivery and others to further augment their containerisation and microservices strategies. This Gartner Research predicts that 70% of new applications developed by organisations will use LCNC by 2025, well up from less than 25% in 2020.

Infrastructure and Data Management in the cloud are being immensely powered up by Automation and Orchestration, especially in minimising manual efforts and errors in processes such as provisioning, configuring, sizing and auto-scaling, asset tagging, clustering and load balancing, performance monitoring, deploying, DevOps and CI/CD testing and performance management. Further efficiencies are brought to fruition through automation, especially in areas such as shutting down unutilised instances, backups, workflow version control, and establishing Infrastructure as Code (IaC). This further value-adds to a robust cloud native architecture by enhancing containerisation, clustering, network configuration, storage connectivity, load balancing and managing the workload lifecycle, besides highlighting vulnerabilities and risks. Enterprises pursuing hybrid cloud strategies are hence driving automation in private clouds as well as integrating with public clouds by creating automation assets that perform resource codification across all private and public clouds and offer a single API. This McKinsey research highlights that companies that have adopted end-to-end automation in their cloud platforms and initiatives report a 20-40% increase in the speed of releasing new capabilities to market. A similar report by Deloitte mentions that intelligent automation in the cloud enables scale in just 4-12 months, compared to the earlier 6-24 month period, through streamlined development and deployment processes.
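
One of the automation wins named above, shutting down unutilised instances, can be sketched with AWS's boto3 SDK; the auto-stop tag convention is our assumption, and the snippet needs AWS credentials, so it is illustrative rather than production-ready.

```python
# Illustrative sketch: stop running EC2 instances tagged as safe to shut down.
# Assumes AWS credentials are configured; the "auto-stop" tag is hypothetical.
import boto3

ec2 = boto3.client("ec2")

resp = ec2.describe_instances(Filters=[
    {"Name": "instance-state-name", "Values": ["running"]},
    {"Name": "tag:auto-stop", "Values": ["true"]},
])

instance_ids = [
    inst["InstanceId"]
    for reservation in resp["Reservations"]
    for inst in reservation["Instances"]
]

if instance_ids:
    ec2.stop_instances(InstanceIds=instance_ids)   # the actual cost saving
```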

CIOs are also increasingly turning to distributed cloud models to address edge or location-based cloud use cases, especially across Banks and Financial Institutions, Healthcare, Smart Cities, and Manufacturing. It is expected that decentralised and distributed cloud computing will move from the initial private cloud substation deployments to eventual Wi-Fi-like distributed cloud substation ecosystems, especially considering the necessary availability, bandwidth and other operational and security aspects.

These rapid developments in cloud ecosystems, especially for hybrid and multi-cloud environments, have necessitated infrastructure and data management to encompass dashboards for end-to-end visibility of all cloud resources and usage across providers, business functions, and departments. Governance and Compliance, Monitoring, Inventory Management, Patches and Version Control, Disaster Recovery, Hybrid Cloud Management Platforms (HCMP), Cloud Service Brokers (CSB) and other tools aid companies in better infrastructure management in the Cloud, catering to fluctuating demands and the corresponding under- and over-utilisation scenarios while continuously identifying pockets for optimisation and correction. For companies with customers spread across diverse geographies, it is important to have tools for infrastructure management, global analytics, database engines and application architectures across these global Kubernetes clusters and Virtual Machines.

The vast increase in attack surfaces and potential breach points has necessitated CIOs and CISOs to incorporate robust security principles and tools within the Cloud Native ecosystem itself, through Cloud Security Platforms such as Cloud Access Security Broker (CASB), Cloud Security Posture Management (CSPM), Secure Access Service Edge (SASE), DevSecOps and the incorporation of AI and ML in their proactive threat hunting and response systems. This is also critical in adhering to Governance, Risk and Compliance (GRC) and regulatory compliances, in line with Zero Trust Architecture and cyber-resilient frameworks and strategy. This McKinsey article highlights the importance of Security as Code (SaC) in cloud native strategies and its reliance on architecture and the right automation capabilities.

This EY article highlights the importance of cybersecurity in cloud native strategies as well as the corresponding considerations in processes, cyber security tools, architecture, risk management, skills and competencies and controls. Data Encryption and Load Protection, Identity and Access Management, Extended Detection and Response (XDR), Security Information and Event Management (SIEM), and Security Orchestration and Response (SOAR) tools that incorporate AI/ML capabilities ensure a proactive rather than reactive response. Considering the vast information to ingest, store and analyse, organisations are also considering or deploying Cyber Data Lakes, either as alternatives to or in conjunction with their SIEM ecosystems.

There is increasing popularity of Financial Operations (FinOps), which is helping organisations gain maximum value from the cloud through the cross-functional involvement of business, finance, procurement, supply chain, engineering, DevOps/DevSecOps and cloud operational teams. Augmented FinOps has been listed by Gartner in the 2022 Hype Cycle for emerging technologies here. FinOps immensely value-adds to infrastructure and data management in the cloud through dynamically and continuously sourcing and managing cloud consumption, demand mapping, and crystallising the Total Cost of Ownership and Operations with end-to-end cost visibility and forecasting to make joint decisions and monitor comprehensive KPIs. Besides the cloud infrastructure management strategies listed in this section, FinOps also incorporates vendor management strategies and leveraging cloud carbon footprint tools for the organisation's Sustainability Goals.

What about Data Management through the Cloud?

The 2nd half of the 2010s, and especially the COVID-19 period, has also resulted in an explosion of IoT, social media, E-commerce and other Digital Transformation. This has made organisations deal with diverse data sources residing in the cloud, on-premise and on the edge; diversity in data sets across sensor, text, image, audio-visual, voice, E-commerce, social media and others; and the volume of data that is now required to be ingested, managed and delivered in real-time and batch mode. Even before the pandemic, this explosion of unstructured data necessitated companies to leverage Hadoop and other open-source-based Data Lakes besides their structured data residing in Data Warehouses. According to this Deloitte article, for their surveyed CXOs, Data Modernisation is an even more critical aspect than cost and performance considerations for migrating to the cloud.

This research by Statista estimated that the total worldwide data amount rose from 9 Zettabytes in 2013 to over 27 Zettabytes in 2021, and the prediction is that this will grow to well over 180 Zettabytes in 2025. Decentralised and distributed cloud computing, Web 3.0, the Metaverse and the rise in Edge Computing will further contribute to this data growth.

Many organisations are looking at the Cloud as the future of data management as this article by Gartner states. As the cloud encompasses more and more data sources, this becomes more pivotal for data architects to have a deeper understanding of metadata and schema, the end-to-end data lifecycle pipeline of ingestion, cleaning, storage, analysis, delivery and visualisation, APIs, cloud automation and orchestration, Data Streaming, AI/ ML models, Analytics, Data Storage and Visualisation, as well as Governance and Security.

Data Architects are hence leveraging cloud computing in their strategies, including scaling, elasticity and decoupling, ensuring high availability and optimal performance in relation to bursts and shutdowns, while optimising cost at the same time. The Data Teams are also deploying Automated and Active Cloud Data management covering classification, validation and governance with extensibility and decoupling. There is also careful consideration for ensuring security for data at rest and in motion, as well as seamless data integration and sharing.

It is also important to choose the right Metadata Strategy and consider options of tiered apps with push-based services, pull-based ETL, and event-based metadata. It is also worthwhile to stress the importance of having a robust Data Architecture/DataOps culture as well, especially considering business and technology perspectives of the end-to-end data lifecycle right from data sources and ingestion, metadata and active data management, streaming, storage, analytics and visualisation. Deploying elasticity, AI/ML and automation brings immense benefits to cloud native strategies.

Considering these aspects of Data Management, organisations have been looking at ML- and API-powered Data Fabrics along with Data Lakes, Warehouses and Layers to manage this end-to-end data lifecycle by creating, maintaining and providing outputs to the consumers of this data, as this Gartner article on technology trends for 2022 highlights.

This article by McKinsey summarises the major pivot points in the data architecture ethos which are fundamentally based on Cloud with containerization and serverless data. These cover hybrid real time and batch data processing, shift from end-to-end COTS applications to modular best in function/ industry, move to APIs and decoupling, shift from centralised Data Warehousing to domain-based architecture and lastly from proprietary predefined datasets to data schema that is light and flexible, especially the NoSQL family.

For BFSI, Telecoms and other industry verticals which need customer data to reside locally, CXOs have been deploying Hybrid Data Management environments, that leverage Cloud Data Management tools to also automate, orchestrate, and re-use the on-premise data, thus providing a unified data model and access interface to both cloud and on-premise datasets.

Application of Automation and Orchestration in Data Storage also ensures prioritisation of processes, tasks and resources to balance speed, efficiency, usage and cost, along with eliminating security vulnerabilities. This is especially applicable for tasks such as provisioning and configuration, capacity management, workflows and data migration, resource optimisation, software updates and data protection and disaster recovery. This World Economic Forum report right before the pandemic highlighted the fact that conventional optical/magnetic storage systems will be unable to handle this phenomenon for more than a century. CIOs and Leaders are hence leveraging automation and cloud, Storage-as-a-Service (STaaS), decentralised Blockchain-powered data storage and storage on the Edge, besides alternatives to conventional electromagnetic/optical data storage mechanisms.

What is the role of people and culture in this cloud powered data and infrastructure management ecosystem?

People, the talent pool and organisation culture play a pivotal part in successful FinOps, cloud native and cloud data management strategies. In this dynamic and uncertain world, it is of paramount importance to have uniformity, alignment and resonance of business KPIs with best practices for Enterprise and Data Architecture and DevOps, as well as those of Engineering, Finance, and Procurement. This environment of continuous evolution and optimisation can only be brought about by an ethos of communication, trust, change management and Business-Finance-IT alignment, which are as important as cloud native strategies, architecture, DevOps, DataOps, security and other engineering talent pools.

The continuing trends of the Great Resignation, Quiet Quitting and Moonlighting necessitate a combination of having the best employee and vendor engagement strategies, a readily available talent pool of architects, analysts, engineers and other skillsets, as well as upskilling.

Winding up?

The Cloud has deeply impacted and revitalised Infrastructure and Data Management in all aspects in the workplace. As per this Deloitte research, it is ideal to leverage an equal mix of people, tools and approaches to address cloud complexity, and have a powerful, agile, elastic, secure and resilient virtual business infrastructure deriving maximum value from the cloud.

Cloud-centric digital infrastructure is a bedrock in the post COVID world, aligning technology with business to support digital transformation, resilience, governance along with business outcomes through a combination of operations, technology and deployment as mentioned in this IDC paper. This is so important in the increasing complexity of todays world across Public Cloud Infrastructure, On-Premises and on the Edge.

With continuing business uncertainty, competitiveness, customer, supplier and employee pressures and stringent IT budgets, organisations are looking at the Cloud to revitalise their Infrastructure and Data Management and gain maximum value.

See the article here:

Revitalising data and infrastructure management through cloud - ETCIO South East Asia

High-Performance Computing (HPC) Market is expected to generate a revenue of USD 65.12 Billion by 2030, Globally, at 7.20% CAGR: Verified Market…

The growing demand for high-efficiency computing across a range of industries, including financial, medical, research, government, and defense, as well as geological exploration and analysis, is a significant growth driver for the HPC Market.

JERSEY CITY, N.J., Oct. 17, 2022 /PRNewswire/ -- Verified Market Research recently published a report, "High-Performance Computing (HPC) Market" By Component (Solutions, Services), By Deployment Type (On-Premise, Cloud), By Server Price Band (USD 250,000–500,000 and Above, USD 100,000–250,000 and Below), By Application Area (Government And Defense, Education And Research), and By Geography.

According to the extensive research done by Verified Market Research experts, the High-Performance Computing (HPC) Market size was valued at USD 34.85 Billion in 2021 and is projected to reach USD 65.12 Billion by 2030, growing at a CAGR of 7.20% from 2023 to 2030.
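
As a quick sanity check on those figures, compounding the 2021 base at the quoted rate over the nine years to 2030 lands almost exactly on the projected value; the arithmetic below is illustrative only and uses the report's own numbers.

```python
# Illustrative arithmetic only, using the report's own numbers.
start, end, rate = 34.85, 65.12, 0.0720      # USD billions, quoted CAGR
years = 2030 - 2021                          # nine compounding years

projected = start * (1 + rate) ** years            # ~65.16
implied_cagr = (end / start) ** (1 / years) - 1    # ~7.19%

print(f"projected 2030 value: USD {projected:.2f} Billion")
print(f"implied CAGR: {implied_cagr:.2%}")
```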

Download PDF Brochure: https://www.verifiedmarketresearch.com/download-sample/?rid=6826

Browse in-depth TOC on "High-Performance Computing (HPC) Market"

202 Pages, 126 Tables, 37 Figures

Global High-Performance Computing (HPC) Market Overview

High-performance computing is the use of parallel processing to run large software programmes efficiently, consistently, and quickly. High-Performance Computing is a technique that makes use of a sizable amount of computing power to offer high-performance capabilities for resolving various issues in the fields of engineering, business, and research. HPC systems refer to all types of servers and micro-servers used for highly computational or data-intensive applications. High-performance computing systems are those that can perform 10^12 floating-point operations per second, or one teraflop.
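
Parallel processing, the essence of the definition above, can be illustrated on a single machine with Python's multiprocessing module; real HPC spreads the same pattern across whole clusters, typically via MPI.

```python
# Toy parallelism: farm a "simulation step" out to several worker processes.
from multiprocessing import Pool

def heavy_step(x: int) -> int:
    return x * x          # stand-in for an expensive numerical kernel

if __name__ == "__main__":
    with Pool(processes=4) as pool:                 # 4 parallel workers
        results = pool.map(heavy_step, range(1_000_000))
    print(sum(results))   # aggregate the partial results
```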

One of the main factors influencing the growth of the High-Performance Computing (HPC) Market is the ability of HPC solutions to swiftly and precisely process massive volumes of data. The increasing demand for high-efficiency computing across a range of industries, including financial, medical, research, exploration and study of the earth's crust, government, and defense, is one of the primary growth factors for the high-performance computing (HPC) market.

The rising need for high precision and quick data processing in various industries is one of the major drivers of the High-Performance Computing (HPC) Market. The market will also grow exponentially over the course of the anticipated period as a result of the growing popularity of cloud computing and the government-led digitization initiatives.

The usage of HPC in cloud computing is what is driving the worldwide high-performance computing (HPC) market. Utilizing cloud computing platforms has a number of benefits, including scalability, flexibility, and availability. Cloud HPC offers a number of benefits, including low maintenance costs, adaptability, and economies of scale.

Additionally, HPC in the cloud gives businesses that are new to high-end computing the chance to form a larger community, enabling them to profit from cheap operating expenditure (OPEX) and overcome the challenges of power and cooling. As a result, it is anticipated that the growing usage of HPC in the cloud would significantly influence the High-Performance Computing (HPC) Market.

Key Developments

Partnerships, Collaborations, and Agreements

Mergers and Acquisitions

Product Launches and Product Expansions

Key Players

The major players in the market are Advanced Micro Devices Inc., Hewlett Packard Enterprise, Intel Corporation, International Business Machines (IBM) Corporation, NEC Corporation, Sugon Information Industry Co. Ltd, Fujitsu Ltd, Microsoft Corporation, Dell Technologies Inc., Dassault Systemes SE, Lenovo Group Ltd, Amazon Web Services, and NVIDIA Corporation.

Verified Market Research has segmented the Global High-Performance Computing (HPC) Market On the basis of Component, Deployment Type, Server Price Band, Application Area, and Geography.

Browse Related Reports:

Enterprise Quantum Computing Market By Component (Hardware, Software, Services), By Application (Optimization, Simulation And Data Modelling, Cyber Security), By Geography, And Forecast

Healthcare Cognitive Computing Market By Technology (Natural Language Processing, Machine Learning), By End-Use (Hospitals, Pharmaceuticals), By Geography, And Forecast

Cognitive Computing Market By Component (Natural Language Processing, Machine Learning, Automated Reasoning), By Deployment Model (On-Premise, Cloud), By Geography, And Forecast

Cloud Computing In Retail Banking Market By Product (Public Clouds, Private Clouds), By Application (Personal, Family, Small and Medium-Sized Enterprises)

Top 10 Edge Computing Companies supporting industries to become 100% self-reliant

Visualize High-Performance Computing (HPC) Market using Verified Market Intelligence:

Verified Market Intelligence is our BI Enabled Platform for narrative storytelling in this market. VMI offers in-depth forecasted trends and accurate Insights on over 20,000+ emerging & niche markets, helping you make critical revenue-impacting decisions for a brilliant future.

VMI provides a holistic overview and global competitive landscape with respect to Region, Country, Segment, and Key Players of your market. Present your market report & findings with an inbuilt presentation feature, saving over 70% of your time and resources for Investor, Sales & Marketing, R&D, and Product Development pitches. VMI enables data delivery in Excel and interactive PDF formats with over 15+ Key Market Indicators for your market.

About Us

Verified Market Research is a leading Global Research and Consulting firm servicing over 5000+ customers. Verified Market Research provides advanced analytical research solutions while offering information enriched research studies. We offer insight into strategic and growth analyses, Data necessary to achieve corporate goals and critical revenue decisions.

Our 250 Analysts and SMEs offer a high level of expertise in data collection and governance, and use industrial techniques to collect and analyze data on more than 15,000 high-impact and niche markets. Our analysts are trained to combine modern data collection techniques, superior research methodology, expertise and years of collective experience to produce informative and accurate research.

We study 14+ categories from Semiconductor & Electronics, Chemicals, Advanced Materials, Aerospace & Defense, Energy & Power, Healthcare, Pharmaceuticals, Automotive & Transportation, Information & Communication Technology, Software & Services, Information Security, Mining, Minerals & Metals, Building & construction, Agriculture industry and Medical Devices from over 100 countries.

Contact Us

Mr. Edwyne Fernandes, Verified Market Research. US: +1 (650)-781-4080 | UK: +44 (753)-715-0008 | APAC: +61 (488)-85-9400 | US Toll Free: +1 (800)-782-1768. Email: [emailprotected]. Web: https://www.verifiedmarketresearch.com/. Follow Us: LinkedIn | Twitter

Logo: https://mma.prnewswire.com/media/1315349/Verified_Market_Research_Logo.jpg

SOURCE Verified Market Research

Continued here:

High-Performance Computing (HPC) Market is expected to generate a revenue of USD 65.12 Billion by 2030, Globally, at 7.20% CAGR: Verified Market...

7 Best Quantum Computing Stocks to Buy in 2022 | InvestorPlace

Quantum computing offers the potential to harness big data, make intricate predictions and use artificial intelligence (AI) to revolutionize business operations. Many industries, such as automotive, agriculture, finance, healthcare, energy, logistics and space, will be affected by the growth of this technology. As a result, Wall Street has been paying significant attention to quantum computing stocks.

Once considered science fiction, quantum computing has made significant progress in recent years to solve complex problems at lightning speed. This advanced technology uses the power of quantum mechanics to represent complex problems. These computers can take seconds to calculate equations that normally take days for machines that use a binary framework.

International Data Corporation forecasts that the global market for quantum computing should grow from about $412 million in 2020 to more than $8.5 billion in 2027. This increase would mean a compound annual growth rate (CAGR) of an eye-popping 50% between now and 2027. Given such metrics, it's understandable why investors are thrilled about the future of quantum computing stocks.

While it is currently early days, Wall Street has already warmed up to the long-term prospects of this technology. Besides several pure-play quantum computing stocks going public in 2021, well-known tech names are pouring significant research dollars into this advanced segment.

With that information, here are the seven best quantum computing stocks to buy in 2022:

52-week range: $142.25 – $191.95

Dividend yield: 1.7%

Semiconductor group Analog Devices manufactures integrated circuits that process analog and digital signals. ADI's chips are used in data converters, high-performance amplifiers and microwave-integrated circuits.

Analog Devices issued Q4 2021 metrics on Nov. 23. Revenue increased 53% year-over-year (YOY) to $2.34 billion. Adjusted earnings soared from $1.44 per share to $1.73 per share. The company generated a free cash flow of $810 million. Cash and equivalents ended the period at $1.98 billion.

Factory automation has fueled demand for sensors and machine connectivity, which increasingly rely on Analog's chips. In addition, the automotive industry has also become a key growth driver due to the rising use of advanced electronics in electric vehicles (EVs).

In late August, the chipmaker completed the acquisition of Maxim Integrated. The multibillion-dollar transaction should increase ADI's market share in automotive and 5G chipmaking.

ADI currently trades just under $160, up 7% over the past 12 months. Shares are trading at 21.5 times forward earnings and 8.9 times trailing sales. The 12-month median price forecast for Analog Devices stock stands at $210.

Defiance Quantum ETF (QTUM)

52-week range: $42.96 – $57.15

Expense ratio: 0.40% per year

QTUM is an exchange-traded fund (ETF) that focuses on the next generation of computing. It offers exposure to names leading the way in quantum computing, machine learning and cloud computing. The fund tracks the BlueStar Quantum Computing and Machine Learning Index.

QTUM, which started trading in September 2018, has 71 holdings. The top 10 holdings account for less than 20% of net assets of $161.5 million. Put another way, fund managers are not taking major bets on any company.

Among the leading holdings on the roster are the security and aerospace company Lockheed Martin (NYSE:LMT), French telecommunications operator Orange (NYSE:ORAN) and IBM.

For most retail investors, QTUM could potentially be a safe and diversified place to start investing in quantum computing. As portfolio companies come from a wide range of technology segments, wide swings in the price of one stock will not affect the ETF significantly.

The fund has gained 8.7% over the past year and saw an all-time high in November 2021. However, the recent selloff in tech stocks led to a 10.7% decline year-to-date (YTD). Interested readers could regard this decline as a good entry point into QTUM.

International Business Machines (IBM)

52-week range: $113.17 – $146.12

Dividend yield: 4.8%

Technology giant International Business Machines (IBM) needs little introduction. The legacy tech name offers integrated solutions and services, including infrastructure, software, information technology (IT) and hardware.

IBM announced Q4 2021 financials on Jan. 24. The company generated revenues of $16.7 billion, up 6.5% YOY. Net income stood at $2.33 billion, or $2.57 per diluted share, up from $1.36 billion, or $1.51 per diluted share, in the prior-year quarter. Cash and equivalents ended the period at $6.65 billion.

After the announcement, CEO Arvind Krishna said, "We increased revenue in the fourth quarter with hybrid cloud adoption driving growth in software and consulting."

The company launched its Quantum System One quantum computer in 2019. Around 150 research groups and partner companies currently use IBM's quantum computing services. These names come from financial services businesses, automakers and energy suppliers.

In June 2021, IBM unveiled Europe's most powerful quantum computer in Germany. Moreover, the tech giant recently announced a deal with Raytheon Technologies (NYSE:RTX) to provide quantum computing and AI services for the aerospace, defense and intelligence industries.

IBM currently changes hands around $137, up 20% over the past 12 months. Shares are trading at 13.5 times forward earnings and 2.2 times trailing sales. The 12-month median price forecast for IBM stock is $144.50. Interested readers could consider buying IBM shares around these levels.

IonQ (IONQ)

52-week range: $7.07 – $35.90

IonQ is one of the first publicly traded pure-play quantum computing stocks. It went public via a merger with the special purpose acquisition company (SPAC) dMY Technology Group III in late 2021.

The quantum name released Q3 2021 results on Nov. 15. Its net loss was $14.8 million, or 12 cents per diluted share, compared to a net loss of $3.6 million a year ago. Cash and equivalents ended the quarter at $587 million. Wall Street was pleased that year-to-date contract bookings at the time came in at $15.1 million.

IonQ is currently developing a network of quantum computers accessible from various cloud services. The technology uses ionized atoms that allow IonQs machines to perform complex calculations with fewer errors than any other quantum computer available.

The start-up has the financial backing of prominent investors, including Bill Gates and the Japanese telecommunications company SoftBank Group (OTCMKTS:SFTBF). In addition, IonQ has been developing strategic partnerships with Microsoft, Amazon's (NASDAQ:AMZN) Amazon Web Services and Alphabet's (NASDAQ:GOOG, NASDAQ:GOOGL) Google Cloud.

While IonQ is taking steps to become a commercialization-stage name, it is still a speculative investment. With its potential for explosive growth, it could be an attractive quantum computing stock for investors looking to take a risk.

IONQ stock hovers around $12. The recent selloff in tech stocks has led to a 26.7% decline YTD. Yet, the 12-month median price forecast for IONQ stock stands at $23.

Microsoft (MSFT)

52-week range: $224.26 – $349.67

Dividend yield: 0.8%

Microsoft is one of the largest and most prominent technology firms worldwide. It offers software products and services, including the Azure cloud service, the Office 365 productivity suite and the customer relationship management (CRM) platform Dynamics 365.

Meanwhile, Microsoft Quantum is the world's first full-stack, open cloud quantum computing ecosystem, allowing developers to create quantum applications and run them on multiple platforms. The software giant provides quantum computing services via the cloud on Azure.

Management announced robust Q2 FY22 metrics on Jan. 25. Revenue increased 20% YOY to $51.7 billion. Net income surged 21% YOY to $18.8 billion, or $2.48 per diluted share, compared to $15.5 billion, or $2.03 per diluted share, in the prior-year quarter. Cash and equivalents ended the period at $20.6 billion.

On Jan. 18, Microsoft announced plans to acquire Activision Blizzard (NASDAQ:ATVI), a leading player in game and interactive entertainment development. It will be an all-cash transaction valued at $68.7 billion. Wall Street expects this deal to provide tailwinds for Microsoft's gaming business and building blocks for the metaverse.

MSFT stock currently trades just under $310, up 27% over the past 12 months. Shares trade at 32.6 times forward earnings and 12.3 times trailing sales, and the 12-month median price forecast for Microsoft stock stands at $370.

Nvidia (NVDA)

52-week range: $115.67 – $346.47

Dividend yield: 0.06%

Santa Clara, California-based Nvidia has become an important name in advanced semiconductor design and software for next-generation computing development. InvestorPlace readers likely know the chipmaker is a market leader in the gaming and data center markets.

Nvidia announced impressive Q3 FY 2022 numbers on Nov. 17. Revenue soared 50% YOY to a record $7.1 billion, fueled by record sales in the gaming and data center businesses. Net income increased 62% YOY to $2.97 billion, or $1.17 per diluted share. Cash and equivalents ended the period at $1.29 billion.

The chipmaker provides the necessary processing power that drives the development of quantum computing. Additionally, Nvidia recently released cuQuantum, a software development kit designed for building quantum computing workflows. It has partnered with Google, IBM and other quantum computing players that rely on cuQuantum to accelerate their quantum computing work.

Given its growing addressable market in cloud computing, gaming, AI and, more recently, the metaverse, NVDA stock deserves your attention. Shares are changing hands around $245, up nearly 80% over the past year. Even after an 18% decline YTD, shares are trading at 46.5 times forward earnings and 25 times trailing sales.

Finally, the 12-month median price forecast for Nvidia stock is $350. As the company gets ready to report earnings soon, investors should expect increased choppiness in price.

Supernova Partners Acquisition Company II (SNII)

52-week range: $9.62 – $12.75

Our final stock is Supernova Partners Acquisition II, a SPAC. It is merging with Rigetti Computing, a start-up focused on quantum computer development. As a result of the merger, Rigetti Computing was valued at about $1.5 billion and received $458 million in gross cash proceeds.

Rigetti designs quantum chips and then integrates those chips with a controlling architecture. It also develops software used to build algorithms for these chips.

Rigetti recently announced business highlights for the nine months ended Oct. 31, 2021. Revenue came in at $6.9 million. Net operating loss declined 3% YOY to $26.2 million.

We believe the time for quantum computing has arrived, said founder and CEO Chad Rigetti. Customer demand is increasing as Rigetti quantum computers begin to address high-impact computational problems.

The start-up launched the world's first scalable multi-chip quantum processor in June 2021. This processor boasts a proprietary modular architecture. Now Wall Street expects the company to move toward commercialization.

Rigetti collaborates with government entities and technology companies to advance its quantum processors. For instance, it boasts strategic partnerships with the National Aeronautics and Space Administration (NASA) and the U.S. Department of Energy. It also works with data analytics firm Palantir Technologies (NYSE:PLTR) and electronics manufacturer Keysight Technologies (NYSE:KEYS).

SNII stock is currently shy of $10, down about 4% YTD. As investor interest in quantum computing names grows, shares are likely to heat up.

On the date of publication, Tezcan Gecgil holds both long and short positions in NVDA stock. The opinions expressed in this article are those of the writer, subject to the InvestorPlace.com Publishing Guidelines.

Tezcan Gecgil has worked in investment management for over two decades in the U.S. and U.K. In addition to formal higher education in the field, she has also completed all three levels of the Chartered Market Technician (CMT) examination. Her passion is options trading based on technical analysis of fundamentally strong companies. She especially enjoys setting up weekly covered calls for income generation.


Strategic Partnership Agreement to Develop the Quantum Computing Market in Japan and Asia-Pacific – PR Newswire

TOKYO, CAMBRIDGE, England and BROOMFIELD, Colo., Oct. 18, 2022 /PRNewswire/ -- Mitsui & Co., Ltd ("Mitsui") and Quantinuum have signed a strategic partnership agreement to collaborate in the delivery of quantum computing in Japan and the Asia-Pacific region.

Mitsui, which is committed to digital transformation, and Quantinuum, one of the world's leading quantum computing companies, integrated across hardware and software, have entered this strategic partnership to develop quantum computing use cases, which are expected to drive significant business transformation and innovation in the future.

Mitsui and Quantinuum will accelerate collaboration, cooperation, and the development of new business models. They will jointly pursue quantum application development and provide value-added services to organizations working across a variety of quantum computing domains, a market expected to be worth US$450 billion – US$850 billion worldwide by 2040.*

Yoshio Kometani, Representative Director, Executive Vice President and Chief Digital Information Officer of Mitsui & Co., Ltd. stated:"We are very pleased with the strategic partnership between Mitsui and Quantinuum. By combining Quantinuum's cutting-edge quantum computing expertise and diverse quantum talents with Mitsui's broad business platform and network, we will work together to provide new value to our customers and create new business value in a wide range of industrial fields."

Ilyas Khan, Founder and CEO of Quantinuum stated:"The alliance between Mitsui and Quantinuum demonstrates our shared commitment to accelerating quantum computing across all applications and use cases in a diverse range of sectors, including chemistry, finance, and cybersecurity. Today's announcement reinforces our belief in the global quantum leadership shown by corporations and governments in Japan, pioneered by corporate leaders like Mitsui."


About Mitsui & Co., Ltd.

Location: 1-2-1 Otemachi, Chiyoda-ku, Tokyo

Established: 1947

Representative: Kenichi Hori, President and Representative Director

Mitsui & Co., Ltd. (8031: JP) is a global trading and investment company with a diversified business portfolio that spans approximately 63 countries in Asia, Europe, North, Central & South America, The Middle East, Africa and Oceania.

Mitsui has about 5,500 employees and deploys talent around the globe to identify, develop, and grow businesses in collaboration with a global network of trusted partners. Mitsui has built a strong and diverse core business portfolio covering the Mineral and Metal Resources, Energy, Machinery and Infrastructure, and Chemicals industries.

Leveraging its strengths, Mitsui has further diversified beyond its core profit pillars to create multifaceted value in new areas, including innovative Energy Solutions, Healthcare & Nutrition and through a strategic focus on high-growth Asian markets. This strategy aims to derive growth opportunities by harnessing some of the world's main mega-trends: sustainability, health & wellness, digitalization and the growing power of the consumer.

Mitsui has a long heritage in Asia, where it has established a diverse and strategic portfolio of businesses and partners that gives it a strong differentiating edge, provides exceptional access for all global partners to the world's fastest growing region and strengthens its international portfolio.

For more information on Mitsui & Co.'s businesses, visit https://www.mitsui.com/jp/en/index.html

About Quantinuum

Location: Cambridge, U.K., Broomfield, Colorado, U.S.A.

Established: December 2021 (through the merger of Honeywell Quantum Solutions (U.S.) and Cambridge Quantum Computing (U.K.))

Representative: Ilyas Khan, CEO; Tony Uttley, COO; Shuya Kekke, CEO & Representative Director, Japan

Quantinuum is one of the world's largest integrated quantum computing companies, formed by the combination of Honeywell Quantum Solutions' world-leading hardware and Cambridge Quantum's class-leading middleware and applications. Science-led and enterprise-driven, Quantinuum accelerates quantum computing and the development of applications across chemistry, cybersecurity, finance, and optimization. Its focus is to create scalable and commercial quantum solutions to solve the world's most pressing problems in fields such as energy, logistics, climate change, and health. The company employs over 480 individuals, including 350 scientists, at nine sites across the United States, Europe, and Japan.

Selected major customers (in Japan): Nippon Steel Corporation, JSR Corporation

http://www.quantinuum.com


Quantum Leap: "The big bang of quantum computing will come in this decade" – CTech

In the few images that IBM has released, its quantum computing lab looks like the engine room of a spaceship: bright white rooms with countless cables dangling from the ceiling down to a floating floor, pierced with vents. This technological tangle is just the background for the main show: rows of metal supports on which hang what look like... white solar boilers.

There, within these boilers, a historical revolution is taking shape. IBM, a computing dinosaur more than a century old, is trying to reinvent itself by winning one of the most grueling, expensive and potentially promising scientific races ever: the race to develop the quantum computer. "We are living in the most exciting era in the history of computing," says Dario Gil, Senior Vice President of IBM and head of the company's research division, in an exclusive interview with Calcalist. "We are witnessing a moment similar to the one recorded in the 40s and 50s of the last century, when the first classic computers were built." A few weeks after this conversation, his statements received further confirmation when the Nobel Prize Committee announced that the physics prize would go to three researchers whose work served as a milestone in the development of the field.

The name Dario Gil stirs up a lot of quanta and cells in the brains, and maybe even in the hearts, of physicists and computer engineers all over the world. This is the person leading the world's most advanced effort to develop a quantum computer. In September, when Gil landed in Tel Aviv for a short visit to give the opening lecture at the IBM conference, the hall was packed with senior engineers, researchers from the top universities in Israel, and representatives of government bodies - all enthralled by what Gil had to say.

Dario Gil (Photo: Elad Gershgoren)

Gil, 46, was born in Spain and moved to the United States to study at MIT. He completed his doctoral studies there and, immediately after graduation, began working at IBM in a series of research and development positions. Since 2019, he has been leading the company's research division, which has 3,000 engineers at 21 sites, including Israel. Under his management, in 2016, IBM built the first quantum computer whose services are available to anyone: if you have a complicated question, you can go to the IBM Quantum Experience website, remotely access one of the quantum computers through the cloud and, perhaps, receive an answer. But as with everything related to quantum computing, it only sounds simple.

"Quantum computing is not just a name for an extremely fast computer," says Gill. In fact, he explains, the quantum computer is no longer a supercomputer that uses the same binary method that is accepted in every classical computer, but a completely new machine, another step in the evolution leading from strings of shells, through beaded invoices and calculating bars, to gear-based mechanical computers, to the electronic computer and now to the quantum computer. "Essentially, the quantum computer is a kind of simulator of nature, through which it is possible to simulate natural processes, and thus solve problems that previously had no solution," explains Gil. "If the classical computer is a combination of mathematics and information, then quantum computing is a combination of physics and information."

This connection makes it possible to solve certain types of problems with unprecedented speed: Google, which is also developing a quantum computer, claimed in 2019 that it had reached "quantum supremacy", a demonstration of a calculation that a quantum computer can perform far more efficiently than a classical computer. The researchers at Google showed how a quantum computer performed in 200 seconds a calculation that, they claimed, would have required a classical computer ten thousand years to complete. This claim has since been disproved by other researchers, who presented an algorithm that allows a classical computer to perform the same calculation in a reasonable amount of time. But even this Google setback gives an idea of the enormous power a quantum computer will have.

"The quantum computer does not make the classical computer superfluous: they will live together, and each of them will solve different problems," explains Gil. "It's like asking you how to get from point A to point B: you can walk, ride a bicycle, travel by car or fly. If the distance between these points is 50 km, you won't fly between them, right? Accordingly, it is a mode suitable for a classic computer. A quantum computer allows you to fly, even to the moon, and quickly."

You will soon explain to me how it works, and in which areas exactly, but before that, let's start from the bottom line: what can we do with it?

"Quantum computing will make it possible to crack a series of problems that seemed unsolvable, in a way that will change the world. Many of these issues are related to energy. Others are related to the development of new and exciting materials. We tend to take the materials available to us for granted, but in the past there were eras that were defined by the materials that dominated them - The Stone Age', the 'Bronze Age', the 'Iron Age'. Quantum computing will help us develop materials with new properties, therefore the first sector that is already using it is industry, especially the car industry: the car manufacturers are interested in better chemistry, which will enable the production of more efficient and durable batteries for electric vehicles. For a normal computer this is a huge task, and to complete it we have to give up accuracy and settle for approximate answers only, but quantum computing can help quickly develop materials that will fit the task, even without entering the lab. The efficiency of a quantum computer when it comes to questions in chemistry is also used in the pharmaceutical industry, There they are beginning to make initial use of such computers to examine the properties of molecules, and in this way to speed up the development of new drugs; and also in the fertilizer industry, which will be able to develop substances whose production will not harm the environment.

The uses are not limited to the material world. "For the financial sector, for example, the quantum computer enables the analysis of scenarios, risk management and forecasting, and the industry is already very interested in such possible applications, which could provide the general public with dramatically improved performance in investment portfolios, for example."

IBM (Photo: Shutterstock)

At the same time, there are industries that quantum computing will force to recalculate their course, and the information security industry is at the forefront. Modern encryption systems (mainly RSA, one of whose developers is the Israeli Prof. Adi Shamir) are asymmetric: each recipient publishes a key that allows information sent to them to be encrypted (the "public key"), which includes the product of two large prime numbers that are themselves kept secret. To decipher the encrypted information, this product must be broken down into its factors, but without knowing what the initial numbers are, "this task would require a normal computer to calculate for many years," explains Gil. "However, for the quantum computer, such a calculation can be a matter of seconds."

There is a real threat here to an entire industry, the logic behind which has been built since the 1970s, and now suddenly the ground is cracking under it.

"True, a normal computer needs ten thousand years to solve an encryption that a quantum computer would solve in an instant. That is why the quantum computer threatens the world of cyberspace and encryption, which are the basis of all global information security. This is an example that is not related to physics or nature, but simply to the stronger and faster computing power of the quantum computer.

The computer that works against all the rules of intuition

To understand the power of the quantum computer, the concept of "quantum computing" must first be broken down. The first step is to stop thinking in the familiar terms of one and zero. Forget about bits and binaries. The key to understanding quantum computing is recognizing that this dichotomy does not hold: instead of the bit, quantum computing relies on a basic unit of information called a qubit (short for "quantum bit"). The qubit is simultaneously one, zero and everything in between.

This is the moment to stop and explain the theory that underlies the quantum computer, and which seems to go against common sense. "Quantum theory makes it possible to explain the behavior of very, very small particles," Gil explains. "At school we are presented with a model of an atom that looks like a planet, with a nucleus and electrons moving around, but at the beginning of the 20th century, this model turned out to be not very accurate." This happened when physicists such as Max Planck and Albert Einstein realized that light, which until then physics saw as a wave, also behaves as a particle - and the energy of this particle can only be described in "quantum" jumps, that is, as discrete packets. In the decades that followed, this theory was developed more and more, and proved to be effective in describing a variety of phenomena in the world of particles. And yet, its deep meanings remain obscure even today.

Such is, for example, the idea that a particle is in more than one place. According to quantum theory, a particle moving between two points moves simultaneously in all the paths between them, a state called "superposition". It's not that we don't know its exact location: it just doesn't have one. Instead, it has a distribution of possible locations that coexist. In other words, reality is not certain, but probabilistic.
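A few lines of NumPy can make this "distribution of possibilities" concrete. This is only a classical simulation of the underlying mathematics, not a quantum computer, and the variable names are ours:

```python
# One qubit in superposition, simulated as a state vector of complex amplitudes.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)                       # the definite state |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # Hadamard gate

state = H @ ket0             # equal superposition of |0> and |1>
probs = np.abs(state) ** 2   # Born rule: probability = |amplitude|^2
print(probs)                 # [0.5 0.5] -- a distribution, not a definite value

# Measurement collapses the distribution to one concrete outcome:
rng = np.random.default_rng(seed=0)
samples = rng.choice([0, 1], size=1000, p=probs)
print(samples.mean())        # ~0.5 across many repetitions
```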

And this is not the only puzzle posed by quantum theory. Another confusing concept is "entanglement", a situation in which several particles exhibit identical physical values and respond simultaneously to a change in one of them, even if they are at a great distance from each other. Gil suggests thinking of it as tossing two coins: anyone who has studied statistics knows that the probabilities of getting "heads" or "tails" on each of them are independent. But in the quantum model, if the coins (representing particles here) are entangled, then tossing one of them will yield the same result in the other. "Einstein didn't believe in entanglement, and hated these patterns," Gil says with a smile.
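Gil's coin analogy can be simulated the same way. The sketch below prepares a Bell state, the standard maximally entangled two-qubit state and our choice of example, and shows that the two "coins" always land the same way:

```python
# Two entangled qubits: the Bell state (|00> + |11>) / sqrt(2).
import numpy as np

# Amplitudes in the basis order |00>, |01>, |10>, |11>
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)
probs = np.abs(bell) ** 2    # [0.5, 0, 0, 0.5]: mixed outcomes never occur

rng = np.random.default_rng(seed=1)
for outcome in rng.choice(4, size=10, p=probs):
    a, b = divmod(outcome, 2)  # decode into the two "coins"
    print(a, b)                # always 0 0 or 1 1, never 0 1 or 1 0
```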

Measurements that affect the results? A reality that is not absolute but statistical? Particles that become twins even at infinite distance? If these ideas sound puzzling, incomprehensible or counter-intuitive to you, you are not alone: "Whoever comes across quantum theory and is not left stunned, has not understood it," said the physicist Niels Bohr, Einstein's contemporary and his great nemesis, who won the Nobel Prize for his contribution to the development of the theory (Einstein, by the way, had reservations about Bohr's interpretation of the theory's conclusions). Another physicist who won the Nobel Prize for his contribution to the theory, Richard Feynman, commented on this when he said: "If you think you have understood quantum theory, you have not."

The same Feynman is the father of quantum computing: he wanted to simulate the behavior of particles, but due to the probabilistic nature of the theory, a classical computer attempting such a simulation would require an enormous amount of calculation, making the simulation impractical. "Feynman, and other physicists like him, thought that the field of computing had focused on mathematical horizons and moved too far away from nature, and that physics could be more connected to the world of information," explains Gil. "In a historic lecture he gave in 1981, Feynman argued that there was no point in tasking a classical computer with particle simulation, because nature is not classical. He said, 'If we want to simulate nature, we need a machine that behaves like nature, in a quantum way.'" In 1998, this vision was realized, when the first quantum computer was built at the University of Oxford in Great Britain.

A quantum computer utilizes the enigmatic properties of quantum theory, those not fully understood by us, to perform calculations. In a normal computer, the basic unit of information is a "bit", which can have one of two values, 0 or 1; using such bits makes it possible to perform any calculation imaginable, although some calculations may take a very long time. In a quantum computer, the qubit, thanks to superposition, represents not one absolute value but a distribution of values. "You can think of it as a question of more dimensions: one and zero are just the ends, the poles of a coin for example, but it can also have a sideways tilt," explains Gil. Using statistical approaches it is possible to examine the state of the qubit and obtain useful results. This probabilistic approach is not suitable for every problem, but in solving certain problems it is vastly more efficient than the classical computer's search for an absolute answer.

"Because of the entanglement effect, it is also possible to cause the qubits to influence each other," says Gil. And since each qubit represents an entire field of possibilities, each addition of a qubit increases the number of possible connections between the qubits with exponentially increasing power (in the classical computer, on the other hand, the addition of bits grows linearly). At the moment, IBM holds the record for qubits: last year it unveiled a quantum processor with 127 qubits, and its stated goal is to launch a processor with 433 qubits this year, and a processor with 1,021 qubits next year.

Three degrees colder than outer space

This ambition is more pretentious than it seems. It turns out that "building a machine that will behave like nature" is a complex story like no other: the qubits are very sensitive to outside influences, which makes building the computer a very complicated and expensive business. "The quantum computer is very powerful, but at the same time also very delicate," explains Gil. "It utilizes physical processes that occur in the world, but such processes form a system in which everything is connected and everything affects everything, and this can disrupt the results: if energy from the outside world gets inside and couples to the qubits, it will make them behave like normal bits, and the unique ability of quantum computation will be lost. Therefore, a quantum computer must be very isolated from its entire environment. The big challenge is to produce a system that is sufficiently isolated from the outside world, but not too isolated."

When I try to find out what the cost of building a quantum computer is (and IBM has already built 40 of them), Gil avoids a clear answer, but it is enough to hear what the effort entails: "There are several different approaches to building a quantum computer; IBM chose a cryogenic approach, meaning deep freezing, and the use of superconductors. The temperature in the computer is close to absolute zero: at the bottom of its case the temperature is minus 273 degrees Celsius, three degrees colder than outer space and less than one degree above absolute zero. The temperature should be close to absolute zero, but not reach it, because then there is no movement at all, not even of the atoms."

The result is a cooling and protective case that resembles a water heater in shape; inside it sits the computing unit, whose shape earned it the nickname "chandelier" from Gil and his team. "Inside the layers of protection there is a cylinder with the processor in it. Even if only a fraction of an energy particle enters the computer, literally a fraction of nothing, it will be enough to disrupt the results," Gil clarifies.

The great sensitivity, and the protection requirements derived from it, mean that the quantum computer is quite cumbersome: in the newest models, which try to include more and more qubits, the case already reaches a height of several meters. To some extent it is reminiscent of the first generations of classical computers, which looked like huge cabinets. Those classical computers kept getting smaller and smaller, until today we squeeze millions of times more computing power into a simple smartphone, but in the case of quantum computers we cannot expect a similar process: "The quantum computer requires unique conditions that cannot be produced in a simple terminal device, and this will not change in the foreseeable future," Gil explains. "I believe that quantum computing will be a service that we access remotely, as we access cloud services today. It will work similarly to what IBM already enables: the computer sits with us, and we make it possible to access the 'brain' and receive answers. Of the 40 computers we have built since 2016, 20 are available to the public today. About half a million users all over the world have already made use of the capabilities of the quantum computers we built, and based on this use, about a thousand scientific papers have already been published."

Google and Microsoft are heating up the competition

IBM is not the only company participating in the quantum computing race, but Gil exudes full confidence in its ability to lead it: according to him, most competitors only have parts of the overall system, but not a complete computer available to solve problems. Google, as mentioned, is a strong contender in this race, and it also allows remote access to its quantum computing service, Google Quantum AI; Microsoft is also working to provide a similar service on its cloud platform, Azure.

Meanwhile, quantum computing remains a promise "on paper". The theoretical foundations for this revolution were laid 40 years ago, the first proofs were presented more than 20 years ago, and the industry has been buzzing around the field for several years - and we still haven't seen uses that would serve a regular person.

"If you go back to the 1940s, when the first computers were invented, you will see that even then the uses and advantages of the new invention were not clear. Those who saw the first computers said, 'Oh, great, you can use it to crack the code of encryption machines in wars, maybe even calculate routes of ballistic missiles, and that's it. Who's going to use it? Nobody,'" Gil laughs. "In the same way, the success of quantum computing will depend on its uses: how easy it will be to program, how large the community of users will be, what talents will get there. The quantum revolution will be led by a community, which is why education for this field is so important: we need more and more smart people to start to think 'how can I use quantum computing to advance my field'.

"What is beginning these days is the democratization phase of quantum computing, which will allow anyone to communicate with the computer without being an advanced programmer in the field: it will be possible to approach it with a question or a task that will be written in the classical languages of one or zero. That is why we are already seeing more use of quantum computing capacity today.

"There are also many startups that do not actually work to establish a quantum computer, but focus on various components of this world (for example, the Israeli company Quantum Machines, which develops hardware and software systems for quantum computers, and last July was selected by the Innovation Authority to establish the Israeli Quantum Computing Center). The activity of such companies creates a completely new ecosystem, thus promoting the industry and accelerating its development, just as is happening today in the field of ordinary computers. IBM will not rely only on itself either: we would like to benefit from the innovation of smart people in this field, of course also in Israel.

"I am convinced that the big bang of quantum computing will happen in this decade. Our ambition at IBM is to demonstrate 'quantum supremacy' already in the next three years. I believe that the combination of advances in artificial intelligence, together with quantum computing, will bring about a revolution in the industry of the kind that Nvidia made in its market (Nvidia developed unique processors for gaming computers, which made it the chip company that reached a billion dollar revenue the fastest.) Quantum computing can generate enormous value in the industry. It is phenomenally difficult, but it is clear to me that we will see the uses already in the current decade."

The Nobel Prize opens a new horizon for quantum computing

Quantum computing has ignited the imagination of researchers for many decades, but until now it has not left the confines of laboratories. However, the awarding of the Nobel Prize to three researchers in the field indicates that the vision is becoming a real revolution. Alain Aspect of France, the American John Clauser and the Austrian Anton Zeilinger received the award for research they conducted (separately) since the 1970s, in which they examined the phenomenon of quantum entanglement (described in this article), proved its existence and paved the way for its technological use.

The awarding of the Nobel Prize to the entanglement researchers proves that quantum computing is more than a mental exercise for a sect of physicists, and is a defining moment for companies that invest capital in the development of the field. They are pushed to this effort due to a fundamental change in the world in which they operate: in recent decades, the world of computing has operated according to "Moore's Law", which foresees that the density of transistors in computer processors will double every two years in a way that will increase the computing power of these chips. However, as the industry approaches the physical limit after which it will be impossible to cram more transistors onto a chip, the need to develop a quantum computer has become acute.

The numbers also signal that something is happening in the field. In 2020, the scope of the quantum computing market was less than half a billion dollars, but at the end of 2021, in a signal that the vision is beginning to be realized, the research company IDC published an estimate according to which in 2027 the scope of the market will reach $8.6 billion and investments in the field will amount to $16 billion (compared to $700 million in 2020 and $1.4 billion in 2021). IBM CEO Arvind Krishna also recently estimated that in 2027 quantum computing will become a real commercial industry.


VW teams with Canadian quantum computing company Xanadu on batteries – Automotive News Canada

Quantum computing, Ardey added in a release, might trigger a revolution in material science that will feed into the company's in-house battery expertise.

Leaving the bits and bytes of classical computing behind, quantum computers rely on qubits, and are widely seen as having potential to solve complex problems that traditional computers could not work through on reasonable timelines.

The automaker and Toronto-based technology firm have already been collaborating on research into material science, computational chemistry, and quantum algorithms for about a year. That early work set the foundation for the formal partnership, Volkswagen said.

The goal of the research is to develop quantum algorithms that can simulate, more quickly than traditional computer models, how a blend of battery materials will interact. Computational chemistry, which is traditionally used for such work, Ardey said, is reaching its limits when it comes to battery research.

Juan Miguel Arrazola, head of algorithms at Xanadu, said the partnership is part of the Canadian companys drive to make quantum computers truly useful.

Focusing on batteries is a strategic choice given the demand from industry and the prospects for quantum computing to aid in understanding the complex chemistry inside a battery cell.

Using the quantum algorithms, Volkswagen said it aims to develop battery materials that are safer, lighter and cheaper.
