A new nanomaterial for precision medicine and the green transition – EurekAlert

Image: Representation of the nanocluster.

Credit: Politecnico di Milano

The SupraBioNano Lab (SBNLab) at the Politecnico di Milano's Department of Chemistry, Materials and Chemical Engineering "Giulio Natta", in partnership with the University of Bologna and Aalto University in Helsinki (Finland), has for the first time synthesised a superfluorinated gold nanocluster, made up of a core of only 25 gold atoms to which 18 branch-structured fluorinated molecules are linked. The work was recently published in the prestigious journal Nature Communications.

Metal nanoclusters are an innovative class of highly complex nanomaterials, characterised by ultra-small dimensions (<2 nm) and peculiar chemical-physical properties, such as luminescence and catalytic activity, that encourage their application in various scientific fields of high importance to modern global challenges. These include precision medicine, in which metal nanoclusters are used as innovative probes for diagnostic and therapeutic applications, and the energy transition, where they are applied as efficient catalysts for the production of green hydrogen.

The crystallisation of metal nanoclusters offers the possibility of obtaining high-purity samples, allowing their fine atomic structure to be determined; however, at present this remains a very difficult process to control. The methodologies developed in this study promoted the crystallisation of nanoclusters, allowing their atomic structure to be determined by means of x-ray diffraction at the Sincrotrone Elettra in Trieste. The end result is the structural description of the most complex fluorinated nano-object ever reported.

"Thanks to the presence of a completely fluorinated shell, containing almost 500 fluorine atoms, the gold nanocluster is stabilised by the numerous interactions between the fluorine atoms of the ligands, encouraging crystallisation," states Professor Giancarlo Terraneo.

"It will soon be possible to study the structure of these advanced nanomaterials at the Politecnico di Milano where, thanks also to the Next-GAME (Next-Generation Advanced Materials) grant from the Region of Lombardy, a laboratory dedicated to the use of state-of-the-art x-ray instruments to characterise crystals, nanoparticles and colloids is being established," concludes Professor Pierangelo Metrangolo, on behalf of Next-GAME.

The interactions between the fluorine atoms, both within the nanocluster and between nanoclusters, were rationalised using quantum chemistry techniques at the University of Bologna's "G. Ciamician" Department of Chemistry by Dr Angela Acocella and Professor Francesco Zerbetto.

Professor Valentina Dichiarante, Professor Francesca Baldelli Bombelli, Dr Claudia Pigliacelli and Professor Giulio Cerullo of the Politecnico di Milano's Physics Department also contributed to the study, examining the nanocluster's optical characteristics and demonstrating the fluorinated ligands' impact on the gold core's optical activity.

The study, "High-resolution crystal structure of a 20 kDa superfluorinated gold nanocluster" (C. Pigliacelli et al., Nat. Commun. 2022, 13, 2607), is available at the following link: https://www.nature.com/articles/s41467-022-29966-2

Journal: Nature Communications

Method of Research: Experimental study

Subject of Research: Not applicable

Article Title: High-resolution crystal structure of a 20 kDa superfluorinated gold nanocluster

Article Publication Date: 11-May-2022



Origami millirobots bring health care closer to precision drug delivery – Nanowerk

Jun 15, 2022 (Nanowerk News) If you've ever swallowed the same round tablet in hopes of curing everything from stomach cramps to headaches, you already know that medicines aren't always designed to treat precise pain points. While over-the-counter pills have cured many ailments for decades, biomedical researchers have only recently begun exploring ways to improve targeted drug delivery when treating more complicated medical conditions, like cardiovascular disease or cancer.

A promising innovation within this burgeoning area of biomedicine is the millirobot. These fingertip-sized robots are poised to become medicine's future lifesavers, able to crawl, spin, and swim into narrow spaces on a mission to investigate the body's inner workings or dispense medicines.

Origami millirobot with spinning-enabled propulsion. (Image: Zhao Lab)

Leading research in this field, Stanford University mechanical engineer Renee Zhao is working on many millirobot designs at once, including a magnetic crawling robot that was recently seen worming its way through a stomach on the cover of Science Advances. Powered by magnetic fields, which allow for continuous motion and can be applied instantly to generate torque and change the way the robots move, her robots can self-select different locomotive states and overcome obstacles in the body. Just by shifting the strength and orientation of the magnetic field, Zhao's team can send the robot sailing across the body, covering distances 10 times the robot's length in a single leap.

A key aspect of her research, magnetic actuation also provides untethered control for non-invasive operation and separates the control unit from the device to allow for miniaturization. Zhao said their most recent robot, featured in Nature Communications ("Spinning-enabled wireless amphibious origami millirobot"), is "the most robust and multifunctional untethered robot we have ever developed."

This new spinning-enabled wireless amphibious origami millirobot is as multifunctional as its name implies. It is an elegantly conceived single unit that is able to travel speedily over an organ's slick, uneven surfaces and swim through body fluids, propelling itself wirelessly while transporting liquid medicines. "Unlike pills swallowed or liquids injected, this robot withholds medicine until it reaches the target, and then releases a high-concentration drug," said Zhao, who is an assistant professor of mechanical engineering. "That is how our robot achieves targeted drug delivery."

Reshaping drug delivery

What's groundbreaking about this particular amphibious robot, according to Zhao, is that it goes beyond the designs of most origami-based robots, which only utilize origami's foldability to control how a robot morphs and moves.

On top of looking at how folding could enable the robot to perform certain actions (imagine an accordion fold that squeezes out medicine), Zhao's team also considered how the exact shape and dimensions of each fold influence the robot's rigid motion when it is not folded. As a result, the robot's unfolded form inherently lends itself to propulsion through the environment. Such broad-minded considerations allowed the researchers to get more use out of the materials without adding bulk, and in Zhao's world, the more functionality achieved from a single structure within the robot's design, the less invasive the medical procedure is.

Another unique aspect of the robot's design is its combination of geometrical features. A longitudinal hole through the robot's center and lateral slits angled up the sides reduce water resistance and help the robot swim better. "This design induces a negative pressure in the robot for fast swimming and meanwhile provides suction for cargo pickup and transportation," Zhao said. "We take full advantage of the geometric features of this small robot and explore that single structure for different applications and for different functions."

Based on conversations with Stanford Department of Medicine experts, the Zhao Lab is considering how to improve upon current treatments and procedures by building new technologies. If this work goes Zhao's way, her robots won't just provide a handy way to dispense medicine effectively but could also be used to carry instruments or cameras into the body, changing how doctors examine patients. The team is also working on using ultrasound imaging to track where the robots go, eliminating any need to cut open organs.

The smaller and simpler, the better

While we won't see millirobots like Zhao's in real health care settings until more is known about optimal design and imaging best practices, the lab's first-of-its-kind swimmer highlighted in Nature Communications is among its robots that are furthest along. It is currently in the trial stages that precede live animal testing, which in turn precedes human clinical trials.

In the meantime, Zhao's team continues combining a variety of novel smart materials and structures into unique designs that ultimately form new biomedical devices. She also plans to continue scaling down her robots to further biomedical research at the microscale.

As an engineer, Zhao strives to develop the simplest structures with the most functionality. Her amphibious robot exemplifies that mission, as it inspired her team to more fully consider geometric features not yet commonly prioritized by other origami robot researchers. "We started looking at how all these work in parallel," Zhao said. "This is a very unique point of this work, and it also has broad potential application in the biomedical field."

Read the original here:
Origami millirobots bring health care closer to precision drug delivery - Nanowerk

New Amrita Hospital is all set to open in Faridabad in August this year; 2,400-bed facility will become India's biggest private hospital – The…

Amrita Hospitals announced on Thursday that its new 2,400-bed campus in Faridabad will open to the public in August this year. During the press conference on Thursday, hospital management said that the new Amrita Hospital is spread across 133 acres of land in Faridabad and will be the biggest private-sector hospital in India.

This would be the second large-scale Amrita Hospital in India after the iconic 1,200-bed Amrita Hospital in Kochi, Kerala, which was established 25 years ago by the Mata Amritanandamayi Math.

The new hospital is located in Sector 88, Faridabad, and will have a total built-up area of 1 crore sq. ft., including a 14-floor tower that will house the key medical facilities and patient areas. During the press conference, Swami Nijamritananda Puri, Head of the Mata Amritanandamayi Math, Delhi, announced that the hospital's 81 specialties will include eight centers of excellence, such as oncology, cardiac sciences, neurosciences, gastro-sciences, renal sciences, bone diseases and trauma, transplants, and mother and child.

The hospital will become operational in stages, with 500 beds opening in August this year. In two years, this number will rise to 750 beds, and further to 1,000 beds in five years. When fully operational, the hospital will have a staff of 10,000 people, including over 800 doctors.

On how the new hospital has incorporated pandemic-induced demands, Dr. Sanjeev K Singh, Medical Director, Amrita Hospital, Faridabad, told Financial Express.com: "We have learned a lot from the pandemic. The construction of the hospital began 5-6 years ago and the learnings from the pandemic were incorporated along the way. For example, any patient who comes in an emergency gets facilitated in a 40-bed setup. In that setup, we have a decontamination area to which anyone who needs to shower will be sent. We have four negative-pressure rooms, and if we have any suspected cases of COVID-19 or COVID-like diseases we can send them to the concerned specialists. The mechanism of shifting is also planned and implemented. In all critical care units, there are positive-pressure isolation rooms."

The massive facility will also include 534 critical care beds, which the hospital management claims is the highest in India. The hospital campus will also include 64 modular operation theaters, the most advanced imaging services, a fully automated robotic laboratory, high-precision radiation oncology, the most up-to-date nuclear medicine, and nine state-of-the-art cardiac and interventional cath labs for clinical services. Cutting-edge medical research will be a strong thrust area, with a dedicated research block spread across a 7-floor building totalling 3 lakh sq. ft., with an exclusive Grade A to D GMP lab focused on identifying newer diagnostic markers, AI, ML, bioinformatics, etc.

Dr. Singh also told Financial Express.com that they want to integrate all aspects of medical science and bridge the gap between clinicians and scientists.

"In Kochi, we have established tissue engineering, a nano-medicine-based cardiac stent, bone growth, and lots more. What we are looking at for the Faridabad campus is developing something new in stem-cell therapies. We want to create techniques like growing human cells on our own in our GMP labs, as generally we rely on international counterparts for such procedures. Recently, we conducted research in which we found that we can use patients' pluripotent stem cells in tumours and they will destroy them. For us, oncology is the big thrust area, but other areas will be a focus too. The intent of our research facility will be to make high-end, expensive equipment and treatments cost-effective for the common man. We want to integrate medicine, engineering, biotechnology, and other segments altogether," Dr. Singh told Financial Express.com.

Dr. Singh also said that they have already been awarded an Advanced ICMR Clinical Trial Unit, which will enable them to conduct trials in the new facility.

"Mata Amritanandamayi has allocated a certain amount of seed money to initiate research. On the basis of submitted proposals, things will materialise and start," he added.

Dr. Singh also told Financial Express.com that the new hospital will be empaneled. "There is a process of 3-6 months, and thereafter medical facilities will be available under all panels like ECHS, CGHS and other TPAs," he added.

During the press conference, Dr. Singh also said that the hospital will be among the very few facilities in the country to conduct hand transplants, a specialty pioneered by Amrita Hospital in Kochi. "We will also do transplants of liver, kidney, trachea, vocal cords, intestine, heart, lung, pancreas, skin, bone, face and bone marrow," he said.

Training of medical students and doctors will be a strong focus area. "The hospital will have a state-of-the-art robotics, haptics and surgical-medical simulation centre spread across 4 floors and a 1.5 lakh sq. ft. area, the biggest such learning and development facility for doctors in the country. The facility will also host a medical college and the country's biggest allied health sciences campus," he stated.

Moreover, the management also said that the ultra-modern Amrita Hospital at Faridabad will be one of India's largest green-building healthcare projects, with a low carbon footprint. It will be an end-to-end paperless facility with zero waste discharge.

There is also a helipad on the campus for swift transport of patients and a 498-room guest house where attendants accompanying the patients can stay, they said.


Should University Agricultural Research Scientists Partner With Industry? – Genetic Literacy Project

Paul Vincelli, extension professor and Provost's Distinguished Service Professor at the University of Kentucky | March 7, 2017

HIGHLIGHTS:

- Biases, conflicts of interest come from many sources, including associations with industry, advocacy groups, other non-profits
- Industry funding of studies on GE crops does not appear to be important bias source
- Personal experience suggests corporations receptive to negative results, as they improve products, limit liability
- Limited resources for much agricultural research without industry support
- Dubious "shill" accusations against biotech scientists discourage public engagement, depress discourse

Agricultural scientists who interact with the public often feel under enormous scrutiny. One of the most common concerns is that professional ties with industry, especially obtaining funding from industry, compromise scientific credibility. This concern is particularly acute in the area of genetically engineered crops (GE crops, commonly known as GMOs).

Research into genetically engineered crops is not my specialty (my work is focused on plant pathology), and I have never solicited nor received private-sector funding on this issue. Over my career, my industry interactions have dealt with non-GMO products for plant disease control. My interest in GE crops arises from their potential to address genuine human needs and to reduce the environmental footprint of agriculture. And I am concerned that a dark shadow has been cast over many independent scientists because of their collaborative efforts with various stakeholders, including companies.

Biases From Many Sources

Across multiple disciplines, industry-funded projects may be more likely to report positive outcomes, or less likely to report negative outcomes [1-4]. However, industry funding is not always associated with biased outcomes [5, 6]. Furthermore, many sources of funding (NGOs, non-profits, and other civil and governmental organizations) may engender conflicts of interest (COIs) and biases that influence reported research. Powerful biases may arise for non-monetary reasons [7] in both researchers and non-researchers, possibly including you and me.

Regarding GE crops, I am aware of three journal articles on the topic of industry funding and bias. In the first [8], the authors found no evidence of bias due to financial COIs (studies sponsored by an industry source that may benefit from the outcome), but they did document bias associated with professional COIs (where at least one author was affiliated with a company that could benefit from the study outcome). In that study, among the 70 studies examined (see their Table 2), 61% had either a financial or a professional COI. Among the much larger sample size (698 studies) examined by Sanchez [9], the majority had no COI, and only one quarter had COIs related to author affiliation and/or declared funding source.

A recent study by Guillemaud et al. [10] had similar findings: among 579 studies with definitive COI information (see their Figure 3), the majority did not report a COI. However, among those with COIs, there was a higher probability of reported outcomes favorable to the GE crop industry. In addition to these journal articles, another independent analysis [11] suggested that industry funding did not bias study outcomes for GE crops, but these data have been neither analyzed statistically nor published in a peer-reviewed journal.

Thus, while evidence to date shows that the majority of studies on GE crops are not influenced by COIs, some fraction is so influenced. Therefore, there is value in remaining alert to the possibility of bias and in continuing to practice full disclosure. I believe it is important to remain alert to COIs and biases of all sorts, not only those associated with corporate influences, but also those of NGOs or other civil organizations that may have explicit or implicit agendas.

Some people simply do not trust corporations. This is understandable, given the indefensible behavior of some in business, such as the tobacco industry, the chemical industry, Exxon, and Volkswagen [12-15]. Consequently, some members of the public perfunctorily dismiss commercial-sector scientists who may have solid scientific skills and high personal integrity. I personally must admit to a measure of distrust of corporations, which may even express itself occasionally as an anti-industry bias. But I also believe it is unwise to categorically reject all industry-funded data solely on the basis of their provenance. In fact, I would label such an attitude a bias itself. Thoughtful, evidence-based analysis must always trump bias and ideology, and it does, for a good scientist.

Why do researchers accept industry funding? Public-sector and private-sector scientists may share common interests. Industry scientists and I share a common interest in knowing what works in the field and what doesn't. Consequently, industry sources provide funding for field tests of their products for plant disease control. Furthermore, public funding for science in the USA is insufficient to support even a fraction of the worthy research projects. Inadequate funding can quickly and thoroughly undercut a career in science at any stage. Since researchers are hired to do research on important topics, and not to whine about the difficult state of public funding, some will welcome funding from commercial sources if it allows them to continue to do research they believe is intellectually compelling, important to society, or both. Also, industry scientists may have knowledge, skills, and facilities that we public scientists do not.

My Funding Choices: Scientific Rigor Coupled With Personal Integrity

Discussing my own practices should provide an idea of how many scientists work. Roughly half of my funding over the years has been from industry, primarily to support product testing for plant disease control. I have commonly tested synthetic fungicides, but I have also tested natural products of various sorts. In fact, commercial pesticide manufacturers can fairly accuse me of an anti-pesticide bias. I say this because I have tended to favor testing products that might be perceived as more consistent with sustainability (biocontrol products, for example) than applications of synthetic chemicals, often requesting limited, or no, funding for such tests. Besides industry funding, I have received federal funds for research and outreach on detection and management of plant diseases.

I publish all efficacy trials in Plant Disease Management Reports. We commonly publish data showing inadequate efficacy or phytotoxicity, and I never consider funding sources when the report is drafted. In fact, the reports are drafted by the Senior Research Analyst who conducts the field work, and he doesn't know who provided funding nor in what amount. Thus, our testing program does not suffer from publication bias. This approach is not exceptional [16, 17].

I accept no personal gifts, monetary or material, from private-sector sources.

I have no hesitation about challenging multinational corporations. For example, I provided a degree of national leadership in challenging a major pesticide manufacturer over certain uses of a commercial crop fungicide. I was one of the lead authors of a letter to the US Environmental Protection Agency raising questions about the paucity of public data to support plant health claims. I gave a similar talk at a major scientific conference, the 2009 American Phytopathological Society meeting.

Several factors may help me and other scientists to offset natural human tendencies towards bias:

A common concern is that providing funding buys access to researchers. This may sometimes be the case, but for me, this criticism doesn't fit. I am an Extension Specialist; everybody has access to me and my expertise. I don't recall a single instance in my entire career when I failed to return a phone call or email from anyone. In fact, it is a federal requirement that Extension programming be grounded in engagement with diverse stakeholders, including, but certainly not limited to, industry [18].

What Happens When Data Fall Short Of Company Expectations?

We regularly see poor product performance in our experiments. In a memorable instance, we observed visible injury to a creeping bentgrass putting green from a particular formulation of the widely used fungicide, chlorothalonil. On the day of application, the turfgrass was suffering exceptionally severe drought stress, due to an irrigation equipment failure, which probably was a predisposing factor.

I notified the company of my observations, which is my standard practice if a product provides unexpectedly poor performance or unexpected phytotoxicity. This is not to provide the company the opportunity to help me see the error of my ways. Rather, this is simply good scientific practice. I want industry scientists to collect their own samples, so that they may better understand the poor results obtained; and to offer hypotheses or insights that may account for the unexpected results, as they often know things about their product and its performance that I do not.

In the case of the turfgrass injury caused by chlorothalonil, a company representative and I visited the experiment together and shared observations. I listened to the representative's hypotheses and shared my own. After the meeting and additional lab work, I reported my findings in various outlets. In my research program, unfavorable results are reported no differently than favorable results.

I must state emphatically that, in my 34 years of product testing for plant disease control, I cannot recall a single instance where a company representative attempted to pressure me to report favorable results. Company representatives do not like to receive bad news, but in my experience, almost every company representative I have interacted with has been professional enough to recognize the importance of discovering the limitations of their products sooner rather than later. The consequences of introducing an inadequate product can be catastrophic for a corporation.

Corporate Funding for Outreach

What about private-sector funding for outreach? To my knowledge, such funds are never provided with a quid pro quo that the scientist will make particular claims about a company's products. To the contrary, private-sector representatives take note of speakers whose scientific understanding is consistent with their own. They may approach those speakers to discuss possible support for outreach, but without specifying the content of such presentations. Although I refuse industry funding for all aspects of GE crops, I do not suspect undue industry influence when funds are provided for invited speakers' travel expenses or supplies. Even honoraria or stipends for speaking engagements don't particularly concern me. This is true for such funding across the full spectrum of possible funding sources, ranging from advocacy groups for organic agriculture to multinational pesticide manufacturers. I want to see the scientific methods and data, no matter who did the study.

Who Should Pay For Research?

Should publicly funded professors even do product testing? Yes: there is a public interest in independent assessments of how products perform. The more public data on performance, the better.

If you agree that third-party testing is desirable, the question arises: who pays for it? I believe that, usually, the manufacturer is responsible, not the taxpayer. Of course, this raises concern about funding bias. If a researcher wishes to avoid funding bias, can they tap into other sources? Not in my discipline. Pools of public funding for product testing are essentially non-existent.

What about studies of possible impacts of products on the environment? Who should pay for those? Again, in my opinion, such costs fall to the manufacturer, although in some cases there is a compelling public interest that justifies the use of public funds for product testing.

Final thoughts: Does industry-researcher cooperation undermine the credibility of scientific research?

For me, the answer is "No." We should be cognizant of possible biases and COIs due to the source of funding, whether the source is industry, NGOs, advocacy organizations, or others. Disclosure is critical [7, 19]. However, industry scientists are often excellent scientists who take pride in their work, no differently than any industry critic. Yes, we should exercise a degree of caution when reviewing industry-funded research, but the same holds for research funded by advocacy organizations, since each has an agenda. Personally, in all cases, I will not reject either source out of hand; I will judge the work based on its scientific merit.

Sometimes the bias against industry-funded research on GE becomes hurtful, especially on social media. Witnessing dedicated public servants being unfairly attacked as industry "shills" is demoralizing to public scientists, and it has the unintended consequence of discouraging public engagement by scientists who already have very busy professional and personal lives. Such unfounded charges are not only divisive and unproductive: they are unkind and can be abusive. (Sadly, unkind behavior can be found on all sides of the GMO debate.)

My freedom from industry funding on all aspects of GE protects me from similar accusations. Yet it doesn't surprise good scientists that, after years of studying the scientific literature, I independently arrived at an understanding very similar to that presented in the report of the National Academies of Sciences, Engineering, and Medicine (NASEM) published earlier this year [20]. This isn't because industry has somehow influenced me or the members of the NASEM review committee. It is because there is a substantial body of credible science supporting the conclusions presented in the NASEM report. In reviewing the body of peer-reviewed scientific literature on GE crops, one is likely to arrive at similar conclusions. I had an identical experience with the scientific consensus on climate change [21].

Ultimately, with enough careful study of evidence from credible sources, fidelity to good scientific practice, and a degree of humility, it is hard not to arrive at findings rather similar to those of journal-published experts of a scientific discipline. They actually do know something about their subject after all.

Paul Vincelli is an Extension Professor and Provost's Distinguished Service Professor at the University of Kentucky. Over his 26 years at UK, he has developed specializations in management of diseases of corn, forages, and turfgrasses, molecular diagnostics, and international agriculture. He also has provided Extension programming on climate change and on genetic engineering of crops. He currently is UK's Coordinator for the USDA's Sustainable Agriculture Research and Education program, and he serves as Councilor-At-Large for the American Phytopathological Society.

The Genetic Literacy Project is a 501(c)(3) non profit dedicated to helping the public, journalists, policy makers and scientists better communicate the advances and ethical and technological challenges ushered in by the biotechnology and genetics revolution, addressing both human genetics and food and farming. We are one of two websites overseen by the Science Literacy Project; our sister site, the Epigenetics Literacy Project, addresses the challenges surrounding emerging data-rich technologies.

Acknowledgements

Thanks are expressed to John R. Hartman and Jon Entine, for reviewing earlier drafts of the manuscript.

Disclosure Statement

The author declares no conflicts of interest in the topic of GE crops. Detailed disclosure documents may be found here. The author donated the full amount of his monetary honorarium for writing this article to Human Rights Watch.

Literature Cited

1. Bes-Rastrollo, M., Schulze, M. B., Ruiz-Canela, M. and Martinez-Gonzalez, M. A., Financial conflicts of interest and reporting bias regarding the association between sugar-sweetened beverages and weight gain: A systematic review of systematic reviews. PLoS Medicine, 2013, Vol. 10, p. e1001578, DOI: 10.1371/journal.pmed.1001578. Available from: http://journals.plos.org/plosmedicine/article?id=10.1371/journal.pmed.1001578

2. Lesser, L. I., Ebbeling, C. B., Goozner, M., Wypij, D. and Ludwig, D. S., Relationship between funding source and conclusion among nutrition-related scientific articles. PLoS Medicine, 2007, Vol. 4, p. e5, DOI: 10.1371/journal.pmed.0040005. Available from: http://journals.plos.org/plosmedicine/article?id=10.1371/journal.pmed.0040005

3. Vera-Badillo, F. E., Shapiro, R., Ocana, A., Amir, E. and Tannock, I. F., Bias in reporting of end points of efficacy and toxicity in randomized, clinical trials for women with breast cancer. Ann Oncol, 2013, Vol. 24, p. 1238-44, DOI: 10.1093/annonc/mds636. Available from: http://www.ncbi.nlm.nih.gov/pubmed/23303339

4. Landefeld, C. S., Commercial support and bias in pharmaceutical research. Am J Med, 2004, Vol. 117, p. 876-8, DOI: 10.1016/j.amjmed.2004.10.001. Available from: http://www.ncbi.nlm.nih.gov/pubmed/15589496

5. Barden, J., Derry, S., McQuay, H. J. and Moore, R. A., Bias from industry trial funding? A framework, a suggested approach, and a negative result. Pain, 2006, Vol. 121, p. 207-18, DOI: 10.1016/j.pain.2005.12.011. Available from: http://www.ncbi.nlm.nih.gov/pubmed/16495012

6. Kaplan, R. M. and Irvin, V. L., Likelihood of null effects of large NHLBI clinical trials has increased over time. PLoS One, 2015, Vol. 10, p. e0132382, DOI: 10.1371/journal.pone.0132382. Available from: http://www.ncbi.nlm.nih.gov/pubmed/26244868

7. Young, S. N., Bias in the research literature and conflict of interest: an issue for publishers, editors, reviewers and authors, and it is not just about the money. Journal of Psychiatry and Neuroscience, 2009, Vol. 34, p. 412-417. Available from: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2783432/

8. Diels, J., Cunha, M., Manaia, C., Sabugosa-Madeira, B. and Silva, M., Association of financial or professional conflict of interest to research outcomes on health risks or nutritional assessment studies of genetically modified products. Food Policy, 2011, Vol. 36, p. 197-203, DOI: 10.1016/j.foodpol.2010.11.016. Available from: http://www.sciencedirect.com/science/article/pii/S0306919210001302

9. Sanchez, M. A., Conflict of interests and evidence base for GM crops food/feed safety research. Nat Biotechnol, 2015, Vol. 33, p. 135-7, DOI: 10.1038/nbt.3133. Available from: http://www.ncbi.nlm.nih.gov/pubmed/25658276

10. Guillemaud, T., Lombaert, E. and Bourguet, D., Conflicts of interest in GM Bt crop efficacy and durability studies. PLoS One, 2016, Vol. 11, p. e0167777, DOI: 10.1371/journal.pone.0167777. Available from: https://www.ncbi.nlm.nih.gov/pubmed/27977705

11. Brazeau, M., GM Food is Safe According to Independent Studies, in Cosmos. 2014. Available from: https://cosmosmagazine.com/biology/gm-food-safe-according-independent-studies.

12. Gates, G., Ewing, J., Russell, K. and Watkins, D. Explaining Volkswagen's Emissions Scandal. New York Times. 19 Jul 2016. Available from: http://www.nytimes.com/interactive/2015/business/international/vw-diesel-emissions-scandal-explained.html.

13. Markowitz, G. and Rosner, D., Deceit and Denial: The Deadly Politics of Industrial Pollution, With a New Epilogue. 2013, California/Milbank Books on Health and the Public. 446 pp., ISBN 9780520275829.

14. Ingram, D. Judge Orders Tobacco Companies to Admit Deception. Reuters News Agency. 27 Nov 2012. Available from: http://www.reuters.com/article/us-usa-tobacco-idUSBRE8AQ18A20121128.

15. Banerjee, N., Song, L. and Hasemyer, D., Exxon's Own Research Confirmed Fossil Fuels' Role in Global Warming Decades Ago. 16 Sep 2015 [Accessed 18 Aug 2016]. Available from: https://insideclimatenews.org/content/Exxon-The-Road-Not-Taken.

16. Lipsky, P. E., Bias, conflict of interest and publishing. Nature Reviews Rheumatology, 2009, Vol. 5, p. 175-176, DOI: 10.1038/nrrheum.2009.52. Available from: http://www.nature.com/nrrheum/journal/v5/n4/full/nrrheum.2009.52.html

17. Kniss, A., I Am Biased and So Are You: Thoughts on Funding and Influence in Science. 2016 [Accessed 18 Aug 2016]; Available from: http://weedcontrolfreaks.com/2015/08/i-am-biased-and-so-are-you-thoughts-on-funding-and-influence-in-science/.

18. Kelsey, K. D. and Mariger, S. C., A case study of stakeholder needs for Extension education. Journal of Extension, 2002, Vol. 40, 2RIB2. Available from: http://www.joe.org/joe/2002april/rb2.php

19. Lewandowsky, S. and Bishop, D., Don't let transparency damage science. Nature, 2016, Vol. 529, p. 459-461. Available from: http://www.nature.com/news/research-integrity-don-t-let-transparency-damage-science-1.19219

20. National Academies Press. Genetically Engineered Crops: Experiences and Prospects. 2016, ISBN 978-0-309-43738-7. Washington DC. 420 pp. Available from: http://www.nap.edu/23395 [Accessed 18 May 2016].

21. Vincelli, P., Scientific consensus as a foundation for Extension programming. Journal of Extension, 2015, Vol. 53, 1COM2. Available from: http://www.joe.org/joe/2015february/comm2.php


India’s GM Crops Regulation Should Be Based on a Gene’s Effects, Not Its Source – The Wire Science

Representative photos of cotton and brinjal: Wikimedia Commons, CC BY

India has a long and dubious record of regulating genetically altered crops for agriculture. While the nation began at the same time as many other countries, with the same ambitious goals to deploy new genetic engineering tools to address agricultural vulnerabilities, it has fallen behind. Only one crop modified with molecular techniques, pest-resistant cotton, has been approved by regulators.

In an attempt to expand farmers access to genetically engineered crops, in March of this year, the Indian government exempted crops with certain kinds of genetic modifications introduced by genome editing (also known as gene editing) from the cumbersome and time-consuming regulations previously imposed on the commercialisation of all crops genetically modified with molecular techniques.

Specifically (and as explained in more detail below), the new policy exempts crops with simple tweaks to genes that are already natural to the plant but that have not had any foreign DNA added. This approach may be expedient but it is not scientifically sound.

Bt cotton and Bt brinjal

Genetically modified cotton came first to India because of its economic importance and environmental externalities. Specifically, Bt cotton was the first product in the country modified with modern molecular genetic techniques. However, it sparked fierce political debate instigated by internationally visible but misguided activists.

Bt is shorthand for Bacillus thuringiensis, a bacterium found mainly in the soil that produces proteins toxic to some insects, especially the cotton bollworm. The genes that express these proteins were introduced by recombinant DNA technology (a.k.a. gene splicing) into the genome of various crop plants to protect them from pests.

Bt cotton soon became ensnared in spurious societal battles around neo-colonialism, the purported evils of Monsanto, organic agriculture and farmers' suicides. It was officially regulated and socially stigmatised as a "GMO", short for genetically modified organism. After 10 years, it remains India's only approved genetically engineered crop.

The Indian versions of insect-resistant Bt cotton proved highly successful in controlling the bollworm that had ravaged cotton crops. They contained only one transgene (a gene introduced from an unrelated organism), for one trait and for only a single species of bollworm. Yet, because of the presence of this single newly introduced gene, this first successful application of molecular genetic engineering in Indian agriculture was subjected to a long and costly development process.

Herbicide tolerance as a weed-control trait also proved popular, although it was never approved and therefore its cultivation was, and is, illegal.

At the same time, farmers' demand in underground markets moved the transgenic frontier forward in a poorly regulated and awkward way. Farmers vote with their ploughs, and many officials lack the knowledge and/or the incentives to contest illegal plantings.


The biggest flaw in India's cumbersome and poorly understood regulatory system emerged vividly with the introduction of a second genetically engineered crop candidate: Bt brinjal, a staple of some of the world's poorest rural populations.

Brinjal in India is attacked by a boring insect larva (Leucinodes orbonalis) that is susceptible to the same protein as the cotton bollworm. But as with cotton, there is no naturally occurring gene in the brinjal family tree that conventional breeding could utilise. This is why researchers introduced the Bt gene into a brinjal variety, thus rendering it a transgenic organism.

Brinjal is not extensively traded internationally but is very important for small farmers' income and for both local and national consumption. There is also no environmentally acceptable, effective alternative insecticide for farmers to use against brinjal pests.

Field trials of the transgenic brinjal cultivars were extremely promising, even compared to the successes of Bt cotton. The fact that the transgene and the cultivars were both indigenous also suggested that the variety would be nationally acceptable in a way that Bt cotton couldn't be.

The Genetic Engineering Approval Committee of India approved Bt brinjal, but it was vetoed in 2010 by the then environment minister, Jairam Ramesh. It has since been stuck in regulatory limbo in India. During this time, India donated the genetic event EE1 to Bangladesh and the Philippines.

After EE1 was introduced into Bangladeshi varieties of eggplant and tested, the government approved them, and they have been extremely successful. Interestingly, some of the altered brinjal has spread back to India, where it is growing happily, on an unknown scale and unapproved by bureaucrats.

Regulatory discrimination

Both Bt cotton and Bt brinjal in India tell the same story: advances for farmers unavailable through conventional, pre-molecular plant-breeding techniques have proved useful, not panaceas but incrementally beneficial, trait by trait, with more in the pipeline. However, the regulatory system is slow, unscientific, inconsistent and obstructionist. Its concerns often reflect urban politics and the blandishments of activists more than farmers' interests.

Nonetheless, there is hope that the most recent advances in the seamless continuum of genetic modification of plants represented by genome editing will fare better. These techniques allow genetic material to be added, removed or altered at specific locations in the genome.

The best known of these techniques is CRISPR-Cas9. This system is faster, cheaper, more precise and more efficient than earlier genome editing methods. It is also more democratic, by being less dependent on the political heft and huge resources of the multinational plant science corporations. Innovation is thus often centred in universities and individual research teams.

This said, if genome editing is to live up to its potential, its regulation will need to be scientifically defensible and risk-based.

This is why the UK has reconsidered its highly prohibitive stance on molecular genetic engineering. Even the generally anti-genetic engineering EU is discussing a revised legal framework that incorporates genome editing. Consistent with this global trend, in March 2022, India announced that it would exempt certain categories of genome-edited crops from regulatory oversight.

As part of this, it has categorised genome-edited alterations as SDN-1, SDN-2 and SDN-3 (SDN stands for "site-directed nuclease"). Variants made using SDN-1 and SDN-2 involve simply tweaking particular traits that already exist in a genome, whereas SDN-3 involves the insertion of genes from external, or "foreign", sources. So making brinjal resistant to insect predators by introducing genes from B. thuringiensis would put it in the SDN-3 category.

India has announced that SDN-1 and SDN-2 variants will be regulated as non-genetically engineered organisms, as their sequence changes are indistinguishable from those resulting from conventional crop breeding. SDN-3, however, which involves the incorporation of a foreign DNA sequence, will continue to be heavily regulated.

This approach to regulation is unscientific and short-sighted. It has no demonstrated connection to enhanced risk. Instead, the SDN categories are based simply on considerations of how close to nature the new constructions are. Bt cotton, which was introduced to India over 20 years ago and has transformed India's economy, will be classified as an SDN-3 crop, as will Bt brinjal. As such, the latter looks set to remain stuck in the regulatory quagmire it has been in since the beginning of its development.

There is no scientific rationale for a regulatory policy that distinguishes SDN-3 crops from SDN-1 and -2 crops. The difference between these categories is determined by the presence or absence of a "foreign" gene, but the term "foreign" has many connotations, none of which is meaningful for regulation in the current context.

Through advances in genome sequencing, we now know that foreign genes, i.e. genes that originated in an unrelated organism, are present in many crop plants. They may be thought of as natural GMOs. From the sweet potato to several species of grass, genes from unrelated organisms have found their way into the most unexpected places.


Failed tests

What matters from a risk, and therefore regulatory, perspective is not the source of a gene but its function and its effect on phenotype. A construct that results from the addition of a foreign gene via molecular techniques should not be held to a different standard or subjected to a more stringent regulatory regime unless the modification could in some way increase risk.

Baseless regulatory discrimination against transgenic (i.e. SDN-3) crops means that some new varieties that could drastically improve the fortunes of resource-poor people and environmentally vulnerable places will, for practical purposes, remain proscribed and unavailable except through the stealth practices of farmers.

The regulatory policies of the governments of India, the EU and many other countries fail this test of scientific logic. The regulation of molecular genetic engineering has been based more on political considerations than on sound science, and as such cripples progress.

Flawed regulation is not the only problem related to genetically engineered crops in India. Another is the chronic lack of transparency about agricultural technology generally and genetic engineering in particular. Data that support government policies and specific regulatory decisions have been consistently and conspicuously lacking from government sources. That stokes public suspicion of incompetence or even corruption.

That is unfortunate and puzzling, because there is plenty of evidence they could cite. We have more than 20 years of data on commercialised genetically engineered crops worldwide. It is very clear that they are as safe as, or in some cases safer than, crops from other breeding methods. Put another way, there is no evidence that the use of molecular genetic engineering techniques confers unique or incremental risks.

The European Academies Science Advisory Council said in 2013, "There is no valid evidence that [genetically engineered] crops have greater adverse impact on health and the environment than any other technology used in plant breeding." Even the WHO, a component of the notoriously risk-averse UN, agrees: it said in a 2014 report that

"[genetically engineered] foods currently available on the international market have passed safety assessments and are not likely to present risks for human health. In addition, no effects on human health have been shown as a result of the consumption of such foods by the general population in the countries where they have been approved."

Literally hundreds of other analyses by governmental and professional groups have echoed these findings.

Genome editing is both a continuation of the plant modifications humans have depended on for millennia and a promising new frontier. Nevertheless, striking a balance between too little and too much caution is not difficult, given the great precision and predictability of the newer molecular techniques. Science shows the way, and politicians and regulators everywhere should follow it.

Henry I. Miller is a physician and molecular biologist and a senior fellow at the Pacific Research Institute. He was the founding director of the FDA's Office of Biotechnology and a consulting professor at Stanford University's Institute for International Studies. Kathleen L. Hefferon teaches microbiology at Cornell University. Ronald Herring is emeritus professor of government and International Professor of agriculture and rural development at Cornell University.


Tenaya Therapeutics Launches Operations of New Genetic Medicines Manufacturing Center to Support the Development of Potentially First-In-Class…

Facility to Provide Clinical Supply of Lead Gene Therapy Programs TN-201 and TN-401 for Planned First-in-Human Studies

94,000 sq. ft. Modular Facility has Initial Production Capacity at the 1000L Scale

SOUTH SAN FRANCISCO, Calif., June 16, 2022--(BUSINESS WIRE)--Tenaya Therapeutics, Inc. (NASDAQ: TNYA), a biotechnology company with a mission to discover, develop and deliver curative therapies that address the underlying causes of heart disease, today announced that it has completed the build-out and operational launch of its Genetic Medicines Manufacturing Center in Union City, California. Tenaya is advancing a pipeline of therapeutic candidates, including several adeno-associated virus (AAV) gene therapies, for the potential treatment of both rare and prevalent forms of heart disease.


Tenaya's Genetic Medicines Manufacturing Center located in Union City, CA (Photo: Business Wire)

"Tenaya made an early, strategic commitment to internalize several core capabilities to optimize the safety, efficacy, and supply of our product candidates on behalf of patients. With todays announcement we have made a big leap forward on that commitment by establishing end-to-end in-house manufacturing capabilities for our pipeline of AAV-based gene therapies," said Faraz Ali, Chief Executive Officer of Tenaya. "The operational launch of Tenayas Genetic Medicines Manufacturing Center represents an important milestone as we prepare to advance our robust pipeline of potentially first-in-class cardiovascular therapies into initial clinical studies."

Tenaya's Genetic Medicines Manufacturing Center is designed to meet regulatory requirements for production of AAV gene therapies from discovery through commercialization under Current Good Manufacturing Practice (cGMP) standards. Initial production efforts will support first-in-human studies of Tenaya's lead gene therapy, TN-201. TN-201 is being developed for the treatment of genetic hypertrophic cardiomyopathy (HCM) due to MYBPC3 gene mutations. Tenaya plans to submit an Investigational New Drug (IND) application for TN-201 to the U.S. Food and Drug Administration (FDA) in the second half of this year. The facility will also support cGMP production for TN-401, Tenaya's gene therapy program being developed for the treatment of genetic arrhythmogenic right ventricular cardiomyopathy (ARVC) due to PKP2 gene mutations, for which the company plans to submit an IND to the FDA in 2023.


"The investment in our own world-class manufacturing facility provides Tenaya with greater control over product attributes, quality, production timelines and costs, which we believe will ultimately translate into better treatments for patients," said Kee-Hong Kim, Ph.D., Chief Technology Officer of Tenaya Therapeutics. "Tenayas Genetic Medicines Manufacturing Center complements our established internal genetic engineering and drug discovery capabilities and is designed to meet our near- and long-term needs such that we can readily scale and expand as our pipeline matures and evolves."

Tenaya completed customization of approximately half of the 94,000 square foot facility to incorporate manufacturing suites and labs, office space and storage. Utilizing a modular design, the state-of-the-art facility is now fully operational, with initial capacity to produce AAV-based gene therapies at the 1000L scale utilizing Tenaya's proprietary baculovirus-based production platform and suspension Sf9 cell culture system. The excess space and modular design of the Genetic Medicines Manufacturing Center are intended to provide Tenaya with considerable flexibility to expand manufacturing capacity by increasing both the number and the scale of bioreactors to meet future clinical and commercial production needs.

The Union City location, approximately 30 miles from Tenaya's South San Francisco headquarters, is expected to enable the seamless transition of Tenaya's science from early research through commercial manufacturing. The location was selected to foster a culture of close collaboration across teams at all stages of developing and testing novel AAV capsids, de-risk the translation from research to process development, and create opportunities for improvements in production processes. The Genetic Medicines Manufacturing Center is staffed by a growing in-house team with expertise in all aspects of gene therapy manufacture, including process development, analytical development, quality assurance and quality control.

About Tenaya Therapeutics

Tenaya Therapeutics is a biotechnology company committed to a bold mission: to discover, develop and deliver curative therapies that address the underlying drivers of heart disease. Founded by leading cardiovascular scientists from Gladstone Institutes and the University of Texas Southwestern Medical Center, Tenaya is developing therapies for rare genetic cardiovascular disorders, as well as for more prevalent heart conditions, through three distinct but interrelated product platforms: Gene Therapy, Cellular Regeneration and Precision Medicine. For more information, visit http://www.tenayatherapeutics.com.

Forward Looking Statements

This press release contains forward-looking statements as that term is defined in Section 27A of the Securities Act of 1933 and Section 21E of the Securities Exchange Act of 1934. Statements in this press release that are not purely historical are forward-looking statements. Words such as "potential," "will," "plans," "believe," "expected," and similar expressions are intended to identify forward-looking statements. Such forward-looking statements include, among other things, statements regarding the therapeutic potential of Tenaya's pipeline of therapeutic candidates; Tenaya's plan to use the cGMP manufacturing facility for the production of TN-201 and TN-401; Tenaya's belief that its cGMP manufacturing facility will enable seamless transition from early research through commercial manufacturing and translate into better treatments for patients; the expected timing for submission of IND applications for TN-201 and TN-401; and statements by Tenaya's chief executive officer and chief technology officer. The forward-looking statements contained herein are based upon Tenaya's current expectations and involve assumptions that may never materialize or may prove to be incorrect. These forward-looking statements are neither promises nor guarantees and are subject to a variety of risks and uncertainties, including but not limited to: risks associated with the process of discovering, developing and commercializing drugs that are safe and effective for use as human therapeutics and operating as an early stage company; Tenaya's ability to successfully manufacture product candidates in a timely and sufficient manner that is compliant with regulatory requirements; Tenaya's ability to develop, initiate or complete preclinical studies and clinical trials, and obtain approvals, for any of its product candidates; the timing, progress and results of preclinical studies for TN-201, TN-401 and Tenaya's other programs; Tenaya's ability to raise any additional funding it will need to continue to pursue its business and product development plans; negative impacts of the COVID-19 pandemic on Tenaya's manufacturing and operations, including preclinical studies and planned clinical trials; the timing, scope and likelihood of regulatory filings and approvals; the potential for any clinical trial results to differ from preclinical, interim, preliminary, topline or expected results; Tenaya's manufacturing, commercialization and marketing capabilities and strategy; the loss of key scientific or management personnel; competition in the industry in which Tenaya operates; Tenaya's reliance on third parties; Tenaya's ability to obtain and maintain intellectual property protection for its product candidates; general economic and market conditions; and other risks. Information regarding the foregoing and additional risks may be found in the section entitled "Risk Factors" in documents that Tenaya files from time to time with the Securities and Exchange Commission. These forward-looking statements are made as of the date of this press release, and Tenaya assumes no obligation to update or revise any forward-looking statements, whether as a result of new information, future events or otherwise, except as required by law.

View source version on businesswire.com: https://www.businesswire.com/news/home/20220616005336/en/

Contacts

Investors
Michelle Corral
Vice President, Investor Relations and Corporate Communications
Tenaya Therapeutics
IR@tenayathera.com

Media
Wendy Ryan
Ten Bridge Communications
Wendy@tenbridgecommunications.com

Read more here:
Tenaya Therapeutics Launches Operations of New Genetic Medicines Manufacturing Center to Support the Development of Potentially First-In-Class...

Quantum computing: Definition, facts & uses | Live Science

Quantum computing is a new generation of technology involving machines that, for certain specialised tasks, are claimed to be many orders of magnitude faster than the most sophisticated supercomputer in the world today. In the most widely reported demonstration, Google claimed in 2019 that its quantum processor did in minutes what would have taken a traditional supercomputer 10,000 years to accomplish.

For decades, our computers have all been built around the same design. Whether it is the huge machines at NASA or your laptop at home, they are all essentially glorified calculators, and crucially, each processor core can only do one thing at a time.

The key to the way all computers work is that they process and store information made of binary digits called bits. These bits have only two possible values, a one or a zero. These numbers make up binary code, which a computer needs to read in order to carry out a specific task, according to the book Fundamentals of Computers.
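To make the bit picture concrete, here is a minimal Python illustration (ours, not from the book cited above) of how ordinary numbers reduce to the ones and zeros a classical machine stores:

```python
# Every value a classical computer handles is ultimately a string of bits.
for n in [5, 42, 255]:
    print(f"{n} in binary is {n:08b}")  # e.g. 42 -> 00101010
```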

Quantum theory is a branch of physics that deals with the tiny world of atoms and the smaller (subatomic) particles inside them, according to the journal Documenta Mathematica. When you delve into this minuscule world, the laws of physics are very different from what we see around us. For instance, quantum particles can exist in multiple states at the same time. This is known as superposition.

Instead of bits, quantum computers use quantum bits, or 'qubits' for short. While a traditional bit can only be a one or a zero, a qubit can be a one, a zero, or both at the same time, according to a paper from the IEEE International Conference on Big Data.
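A qubit's "both at the same time" can be made precise with a little linear algebra. The sketch below (an illustration of the standard formalism, not taken from the IEEE paper) represents a qubit as a two-component vector of amplitudes whose squared magnitudes give the measurement probabilities:

```python
import numpy as np

# A single qubit as a state vector: amplitudes for the |0> and |1> basis states.
# An equal superposition puts amplitude 1/sqrt(2) on each.
qubit = np.array([1.0, 1.0]) / np.sqrt(2)

# Measurement probabilities are the squared magnitudes of the amplitudes.
p0, p1 = np.abs(qubit) ** 2
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")  # 0.50 each until measured
```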

This means that a quantum computer does not have to wait for one process to end before it can begin another; it can work on many possibilities at once.

Imagine you had lots of doors, all locked except one, and you needed to find out which one was open. A traditional computer would keep trying doors, one after the other, until it found the one that was unlocked. It might take five minutes, it might take a million years, depending on how many doors there were. A quantum computer, loosely speaking, can probe many doors at once; more precisely, Grover's quantum search algorithm finds the open door in roughly the square root of the number of tries a classical machine needs. This is what makes quantum computers so much faster at such searches.
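The door analogy is loose; the precise statement is Grover's quadratic speedup, and the quick comparison below (our arithmetic, not from the article) shows how the gap grows with the number of doors:

```python
import math

# Classical search tries doors one by one: up to N tries in the worst case.
# Grover's algorithm needs roughly (pi / 4) * sqrt(N) quantum queries.
for n_doors in [100, 10_000, 1_000_000]:
    grover = math.ceil(math.pi / 4 * math.sqrt(n_doors))
    print(f"{n_doors:>9} doors: classical <= {n_doors:>9} tries, Grover ~ {grover:>5}")
```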

As well as superposition, quantum particles exhibit another strange behaviour called entanglement, which also makes this tech potentially ground-breaking. When two quantum particles are entangled, they form a connection to each other no matter how far apart they are. Measure one, and the outcome of measuring the other is instantly correlated with it, even if they are thousands of miles apart. Einstein called this particle property "spooky action at a distance", according to the journal Nature.
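Entanglement also has a compact mathematical form. The sketch below (illustrative only) writes down a two-qubit Bell state and shows that the only possible measurement outcomes are the perfectly correlated ones:

```python
import numpy as np

# The Bell state (|00> + |11>) / sqrt(2): amplitudes over the basis 00, 01, 10, 11.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

for basis, p in zip(["00", "01", "10", "11"], np.abs(bell) ** 2):
    print(f"P({basis}) = {p:.2f}")
# Only 00 and 11 ever occur: measuring one qubit fixes what the other will show,
# however far apart the two are.
```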

As well as speed, another advantage quantum computers promise over traditional computers concerns size. Moore's Law observes that computing power doubles roughly every two years, according to the journal IEEE Annals of the History of Computing. But to enable this, engineers have to fit more and more transistors onto a chip. A transistor is like a microscopic light switch that can be either off or on; this is how a computer processes the zeros and ones of binary code.
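The doubling rule compounds quickly, as this back-of-the-envelope calculation (ours) shows:

```python
# Moore's Law: doubling every two years means growth by a factor of 2 ** (years / 2).
for years in [2, 10, 20]:
    print(f"after {years:>2} years: ~{2 ** (years / 2):,.0f}x the computing power")
# 2 years -> 2x, 10 years -> 32x, 20 years -> ~1,024x
```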

To solve more complex problems, you need more of those transistors. But no matter how small you make them, there are only so many you can fit onto a chip. Sooner or later, then, traditional computers are going to be as smart as we can possibly make them, according to the Young Scientists Journal. That is where quantum machines can change things.

The quest to build quantum computers has turned into something of a global race, with some of the biggest companies and indeed governments on the planet vying to push the technology ever further, prompting a rise in interest in quantum computing stocks on the money markets.

One example is the device created by D-Wave. It has built the Advantage system, which it says is the first and only quantum computer designed for business use, according to a press release from the company.

D-Wave said the system has been designed with a new processor architecture featuring over 5,000 qubits and 15-way qubit connectivity, which it said enables companies to solve their largest and most complex business problems.

The firm claims the machine is the first and only quantum computer that enables customers to develop and run real-world, in-production quantum applications at scale in the cloud. The firm said the Advantage is 30 times faster and delivers equal or better solutions 94% of the time compared to its previous generation system.

But despite the huge, theoretical computational power of quantum computers, there is no need to consign your old laptop to the wheelie bin just yet. Conventional computers will still have a role to play in any new era, and are far more suited to everyday tasks such as spreadsheets, emailing and word processing, according to Quantum Computing Inc. (QCI).

Where quantum computing could really bring about radical change though is in predictive analytics. Because a quantum computer can make analyses and predictions at breakneck speeds, it would be able to predict weather patterns and perform traffic modelling, things where there are millions if not billions of variables that are constantly changing.

Standard computers can do what they are told well enough if they are fed the right computer programme by a human. But when it comes to predicting things, they are not so smart. This is why the weather forecast is not always accurate. There are too many variables, too many things changing too quickly for any conventional computer to keep up.

Because of their limitations, there are some computations which an ordinary computer may never be able to solve, or it might take literally a billion years. Not much good if you need a quick prediction or piece of analysis.

But a quantum computer is fast enough that it could respond to changing information quickly and examine an enormous number of outcomes and permutations simultaneously, according to research by Rigetti Computing.

Quantum computers can also be relatively small, because they do not rely on transistors in the way traditional machines do. They also consume comparatively little power, meaning they could, in theory, be better for the environment.

You can read about how to get started in quantum computing in this article by Nature. To learn more about the future of quantum computing, you can watch this TED Talk by PhD student Jason Ball.

Read the original here:
Quantum computing: Definition, facts & uses | Live Science

Quantum computing: D-Wave shows off prototype of its next quantum annealing computer – ZDNet


Quantum-computing outfit D-Wave has announced commercial access to an "experimental prototype" of its Advantage2 quantum annealing computer.

D-Wave is beating its own path to qubit processors with its quantum annealing approach. According to D-Wave, the Advantage2 prototype available today features over 500 qubits. It is a preview of a much larger Advantage2 system, with 7,000 qubits, that the company hopes to make available by 2024.

Access to the Advantage2 prototype is restricted to customers with a subscription to D-Wave's Leap cloud service, but developers interested in trying D-Wave's quantum cloud can sign up for "one minute of free use of the actual quantum processing units (QPUs) and quantum hybrid solvers" that run on its earlier Advantage QPU.

The Advantage2 prototype is built with D-Wave's Zephyr topology, which the company claims offers higher connectivity between qubits than its predecessor topology, Pegasus, used in the Advantage QPU.

D-Wave says the Zephyr design enables shorter chains in its Advantage2 quantum chips, which can make them friendlier for calculations that require extra precision.


"The Advantage2 prototype is designed to share what we're learning and gain feedback from the community as we continue to build towards the full Advantage2 system," says Emile Hoskinson, director of quantum annealing products at D-Wave.

"With Advantage2, we're pushing that envelope again demonstrating that connectivity and reduction in noise can be a delivery vehicle for even greater performance once the full system is available. The Advantage2 prototype is an opportunity for us to share our excitement and give a sneak peek into the future for customers bringing quantum into their applications."

While quantum computing is still experimental, senior execs are bracing for it as a business disruptor by 2030, according to a survey by consultancy EY. The firm found that 81% of senior UK executives expect quantum computing to play a significant role in their industry by 2030.

Fellow consultancy McKinsey noted this month that funding for quantum technology startups had doubled in the past two years, from $700 million in 2020 to $1.4 billion in 2021. McKinsey sees quantum computing shaking up the pharmaceuticals, chemicals, automotive, and finance industries, enabling players to "capture nearly $700 billion in value as early as 2035" through improved simulation and better machine learning. It expects revenues from quantum computing to exceed $90 billion by 2040.

D-Wave's investors include PSP Investments, Goldman Sachs, BDC Capital, NEC Corp, Aegis Group Partners, and the CIA's VC firm, In-Q-Tel.

See the rest here:
Quantum computing: D-Wave shows off prototype of its next quantum annealing computer - ZDNet

Businesses brace for quantum computing disruption by end of decade – The Register

While business leaders expect quantum computing to play a significant role in industry by 2030, some experts don't believe the tech is going to be ready for production deployment in the near future.

The findings, from a survey titled "2022 Quantum Readiness" commissioned by consultancy EY, refer to UK businesses, although it is likely that the conclusions are equally applicable to global organizations.

According to EY, 81 percent of senior UK executives expect quantum computing to have a significant impact in their industry within seven and a half years, with almost half (48 percent) believing that quantum technology will begin to transform industries as soon as 2025.

The naysayers who say quantum tech won't be ready for live deployment any time soon have a point: the industry also suffers from a hype problem, with capabilities being exaggerated and even some accusations of falsification flying around, as when quantum startup IonQ was recently accused by Scorpion Capital of misleading investors about the effectiveness of its quantum hardware.

Joseph Reger, Fujitsu Fellow, CTO for Central and Eastern Europe and a member of the World Economic Forum's Quantum Computing Council, told The Register he is getting some "heat" for saying quantum is not nearly a thing yet.

"There are impressive advantages that pre-quantum or quantum-inspired technologies provide. They are less sexy, but very powerful."

He added: "Some companies are exaggerating the time scales. If quantum computing gets overhyped, we are likely to face the first quantum winter."

Fujitsu is itself developing quantum systems, and announced earlier this year that it was working to integrate quantum computing with traditional HPC technology. The company also unveiled a high-performance quantum simulator based on its PRIMEHPC FX 700 systems, which it said will serve as an important bridge towards the development of quantum computing applications in the future.

Meanwhile, EY claims that respondents were "almost unanimous" in their belief that quantum computing will create a moderate or high level of disruption for their own organization, industry sector, and the broader economy in the next five years.

Despite this, the survey finds that strategic planning for quantum computing is still at an embryonic stage for most organizations, with only 33 percent involved in strategic planning for how quantum will affect them, and only a quarter having appointed specialist leaders or set up pilot teams.

The survey conducted in February-March 2022 covered 501 UK-based executives, all with senior roles in their organisations, who had to demonstrate at least a moderate (but preferably a high) level of understanding of quantum computing. EY said they originally approached 1,516 executives, but only 501 met this requirement, which in and of itself tells a tale.

EY's Quantum Computing Leader, Piers Clinton-Tarestad, said the survey reveals a disconnect between the pace at which some industry leaders expect quantum to start affecting business and their preparedness for those impacts.

"Maximizing the potential of quantum technologies will require early planning to build responsive and adaptable organisational capabilities," he said, adding that this is a challenge because the progress of quantum has accelerated, but it is "not following a steady trajectory."

For example, companies with quantum processors have increased the power of their hardware dramatically over the past several years, from just a handful of qubits to over a hundred in the case of IBM, which expects to deliver a 4,158-qubit system by 2025. Yet despite these advances, quantum computers remain a curiosity, with most operational systems deployed in research laboratories or made available via a cloud service for developers to experiment with.

Clinton-Tarestad said "quantum readiness" is "not so much a gap to be assessed as a road to be walked," with the next steps in the process being regularly revisited as the landscape evolves. He warned that businesses expecting to see disruption in their industry within the next three to five years need to act now.

According to EY's report, executives in consumer and retail markets are those most likely to believe that quantum will play a significant role by 2025, with just over half of technology, media and telecommunications (TMT) executives expecting an impact within the same time frame. Most respondents among health and life sciences companies think this is more likely to happen later, between 2026 and 2035.

Most organizations surveyed expect to start their quantum preparations within the next two years, with 72 percent aiming to start by 2024.

However, only a quarter of organizations have got as far as recruiting people with the necessary skills to lead quantum computing efforts, although 68 percent said they are aiming to set up pilot teams to explore the potential of quantum for their business by 2024.

Fear of falling behind as rival companies develop their own quantum capabilities is driving some respondents to start quantum projects. The applications of quantum computing anticipated by industry leaders would advance operations involving AI and machine learning, especially among financial services, automotive and manufacturing companies. TMT respondents cited potential applications in cryptography and encryption as the most likely uses of quantum computing.

While the EY report warns about companies potentially losing out to rivals on the benefits of quantum computing, there are also dangers that organizations should be preparing for now, as Intel warned about during its Intel Vision conference last month.

One of these is that quantum computers could be used to break current cryptographic algorithms, meaning that the confidentiality of both personal and enterprise data could be at risk. This is not a far-off threat, but something that organizations need to consider right now, according to Sridhar Iyengar, VP of Intel Labs and Director of Security and Privacy Research.

"Adversaries could be harvesting encrypted data right now, so that they can decrypt it later when quantum computers are available. This could be sensitive data, such as your social security number or health records, which are required to be protected for a long period of time," Iyengar told us.

Organizations may want to address threats like this by taking steps such as evaluating post-quantum cryptography algorithms and increasing the key sizes for current crypto algorithms like AES.
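As a hedged sketch of the second step, the snippet below encrypts with a 256-bit AES key using the third-party Python cryptography package; the message and key handling are illustrative only, one possible hardening rather than a full post-quantum migration:

```python
from os import urandom
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Grover's algorithm roughly halves the effective strength of a symmetric key,
# so AES-256 retains ~128-bit security even against a quantum adversary.
key = AESGCM.generate_key(bit_length=256)
aesgcm = AESGCM(key)

nonce = urandom(12)  # 96-bit nonce, must be unique per message
ciphertext = aesgcm.encrypt(nonce, b"sensitive health record", None)
print(aesgcm.decrypt(nonce, ciphertext, None))  # b'sensitive health record'
```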

Or they may simply decide to adopt a wait-and-see attitude. EY will no doubt be on hand to sell consultancy services to help clarify their thinking.

Read the original post:
Businesses brace for quantum computing disruption by end of decade - The Register

McKinsey thinks quantum computing could create $80b in revenue … eventually – The Register

In the hype-tastic world of quantum computing, consulting giant McKinsey & Company claims that the still-nascent field has the potential to create $80 billion in new revenue for businesses across industries.

It's a claim McKinsey has repeated nearly two dozen times on Twitter since March to promote its growing collection of research diving into various aspects of quantum computing, from startup and government funding to use cases and its potential impact on a range of industries.

The consulting giant believes this $80 billion figure represents the "value at stake" for quantum computing players but not the actual value that use cases could create [PDF]. This includes companies working in all aspects of quantum computing, from component makers to service providers.

Despite wildly optimistic numbers, McKinsey does ground the report in a few practical realities. For instance, in a Wednesday report, the firm says the hardware for quantum systems "remains too immature to enable a significant number of use cases," which, in turn, limits the "opportunities for fledgling software players." The authors add that this is likely one of the reasons why the rate of new quantum startups entering the market has begun to slow.

Even the top of McKinsey's page for quantum computing admits that capable systems won't be ready until 2030, which is in line with what various industry players, including Intel, are expecting. Like fusion, it's always a decade or so away.

McKinsey, like all companies trying to work out whether quantum computing has any real-world value, is walking a fine line, exploring the possibilities of quantum computing while showing the ways the tech is still disconnected from ordinary enterprise reality.

"While quantum computing promises to help businesses solve problems that are beyond the reach and speed of conventional high-performance computers, use cases are largely experimental and hypothetical at this early stage. Indeed, experts are still debating the most foundational topics for the field," McKinsey wrote in a December 2021 article about how use cases "are getting real."

One could argue the report is something of a metaphor for the quantum industry in 2022: wild optimism about future ecosystem profitability without really understanding what the tech will mean, to whom, and at what scale.

Go here to see the original:
McKinsey thinks quantum computing could create $80b in revenue ... eventually - The Register

Quantum computing can solve EVs safety woes – Times of India

Recent incidents of electric vehicles (EVs) catching fire have shocked the Indian ecosystem and hindered the broad adoption of these vehicles. Before March of this year, there had been a substantial rise in demand for electric vehicles, alongside rapid advances in innovation and technology. Improvements in battery technology, through increased efficiency and range, have made EVs more accessible to the general public; the sector is currently dominated by two-wheelers and three-wheelers in India. According to Mordor Intelligence, India's electric vehicle market was valued at $1.4 billion in 2021, and it is expected to reach $15.4 billion by 2027, recording a CAGR of 47.09% over the forecast period (2022-2027). Since March, the central challenge for EVs has shifted from affordability, charging and range anxiety to safety. Safety is of prime importance, and EV fires have had dire, even fatal, consequences.

The question is, why is this happening?

A report by the Defence Research and Development Organisation's (DRDO) Centre for Fire, Explosive and Environment Safety points to the EV batteries. The issues highlighted include poor-quality cells, lack of a fuse, problems with thermal management, and the battery management system (BMS).

The highlighted issues cause the batteries to experience thermal runaway, which leads to the fires. This phenomenon occurs when an increase in temperature changes conditions in a way that causes a further increase in temperature, often with destructive results. The issues highlighted in the DRDO report are all potential causes of thermal runaway. Let's explain why.
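The feedback loop is easy to see in a toy model. The numbers below are invented for illustration; the point is only the shape of the curve, heat generation that grows with temperature outpacing cooling that grows with the excess over ambient:

```python
# Toy positive-feedback model of thermal runaway (illustrative numbers only).
ambient = 35.0   # hot-city ambient temperature, degrees C
temp = 40.0      # battery temperature, degrees C
for minute in range(1, 11):
    generated = 0.05 * temp            # self-heating rises with temperature
    removed = 0.04 * (temp - ambient)  # cooling proportional to excess over ambient
    temp += generated - removed
    print(f"minute {minute:>2}: {temp:6.1f} C")
# The temperature climbs without settling: the runaway condition described above.
```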

Local atmospheric temperature directly affects the operating temperature of the battery. For efficient performance, a battery's operating temperature should be around 20-35°C. To keep the battery in this range, EVs need a battery thermal management system (BTMS). With temperatures rising in our cities, these systems are being pushed to their limits, and a poor battery thermal management system is a plausible trigger of thermal runaway.

Another possible cause of thermal runaway is rapid battery charging. As battery technology evolves, charging technology is advancing with it. While fast charging can greatly improve the convenience of EVs, it increases battery-related risks: fast charging can overheat the battery system enough to melt electrical wires and cause short circuits, with explosive consequences, as several charging-related incidents have already shown.

While hot weather and an inadequate battery thermal management system can degrade performance and shorten battery life, they alone cannot cause thermal runaway. As the DRDO report notes, an inefficient fuse, or the absence of one, removes the fail-safe mechanism that would otherwise interrupt thermal runaway.

The causes of thermal runaway highlighted above may stem from inefficient design or insufficient testing by EV manufacturers. But manufacturers cannot spend more time on testing because of time-to-market constraints.

What's the solution?

As stated, design and testing are critical phases of any product's manufacture. Since the era of Industry 4.0, design and testing have moved to digital form, carried out on large-scale, powerful computers through what are called engineering simulations (referred to simply as simulations hereafter). Simulations come in several types, including thermal (studying the effect of heat and temperature on an object), structural (studying an object's strength, stress and failure modes), fluid (studying flow in and around an object) and electrochemical (studying the effect of chemistry on electricity). Thermal runaway is a complex engineering problem entailing all of the simulation types mentioned above. With the right tools, simulations make it possible to mimic every relevant physical condition, rising temperature, fast charging or fuse placement, and to find problem areas. Having identified them, engineers can then test different solutions in simulation and so avoid thermal runaway altogether; a minimal sketch of the thermal category follows below.
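To make the thermal category concrete, here is a minimal sketch of the kind of calculation such tools automate: an explicit finite-difference solution of one-dimensional heat conduction. The material values are placeholders, not real cell data:

```python
import numpy as np

alpha = 1e-6        # thermal diffusivity, m^2/s (assumed placeholder)
dx, dt = 1e-3, 0.2  # 1 mm grid, 0.2 s time step (satisfies dt < dx**2 / (2 * alpha))

temps = np.full(50, 25.0)  # 5 cm slab, initially at 25 C throughout
temps[0] = 80.0            # hot face, e.g. next to an overheating cell

for _ in range(5000):      # advance 1,000 seconds
    temps[1:-1] += alpha * dt / dx**2 * (temps[2:] - 2 * temps[1:-1] + temps[:-2])
    temps[0] = 80.0        # hold the hot boundary fixed

print(f"temperature 1 cm into the slab after ~1000 s: {temps[10]:.1f} C")
```

Real battery-pack simulations couple many such models in three dimensions, together with electrochemistry and fluid flow, which is why they are so computationally expensive.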

The question then becomes: why are we seeing this news at all?

The biggest issue EV manufacturers have with performing numerous simulations is time. Running a series of simulations to obtain results with minimal flaws and defects (high-accuracy simulations) can take months. Manufacturers cannot afford this, as it greatly hampers time to market. Companies therefore opt for simulations that provide solutions with several minor flaws and defects (low-accuracy simulations), which leads to large mishaps such as EV explosions and system failures, putting human lives at risk. In addition, when companies do find the time to perform high-accuracy simulations, the cost they incur is very high, owing to the need for supercomputers, whether on-premises (setup and maintenance costs) or in the cloud (long compute durations).

So the real issue is a computing technology bottleneck. This is where the next generation of computing technology, quantum computers, can step in and revolutionize industries like EV and battery design. The new technology is far more powerful, promising dramatic speed-ups for these industries.

Prospect of Quantum-powered simulations

The power of quantum computers is showcased by their ability to perform the same simulations in much less time than classical supercomputers. Hence, this technology can significantly improve EV manufacturers' time to market.

Moreover, the ability to obtain high accuracy from simulations is vital if they are to be used in the product development process. High-accuracy simulations previously took so long as to be prohibitive; quantum-powered simulations could enable manufacturers to run them in reasonable time, hours instead of months. The added accuracy will not only help companies create more efficient designs and improve the reliability of their vehicles, but also help save something invaluable: lives. In addition, the speed-up from quantum computation reduces computing usage, decreasing overall cost and making high-accuracy simulation affordable for EV manufacturers.

What's next?

In the computing sphere, quantum computing is a revolutionary development, changing our understanding of computation and showing tremendous potential across various use cases. While the prospect of quantum-powered simulations promises better, faster and cheaper results, development is very challenging because quantum computers work in entirely different ways.

The good news is that companies are already developing and building quantum-powered simulation software that can address thermal runaway and BTMS optimization. Quantum computing is here and now!

Views expressed above are the author's own.


The rest is here:
Quantum computing can solve EVs safety woes - Times of India