Strategic Partnership Agreement to Develop the Quantum Computing Market in Japan and Asia-Pacific – PR Newswire

TOKYO, CAMBRIDGE, England and BROOMFIELD, Colo., Oct. 18, 2022 /PRNewswire/ -- Mitsui & Co., Ltd ("Mitsui") and Quantinuum have signed a strategic partnership agreement to collaborate in the delivery of quantum computing in Japan and the Asia-Pacific region.

Mitsui, which is committed to digital transformation, and Quantinuum, one of the world's leading quantum computing companies, integrated across hardware and software, have entered this strategic partnership to develop quantum computing use cases, which are expected to drive significant business transformation and innovation in the future.

Mitsui and Quantinuum will accelerate collaboration, cooperation, and development of new business models. They will jointly pursue quantum application development and provide value-added services to organizations working across a variety of quantum computing domains, in a market expected to be worth between US$450B and US$850B worldwide by 2040.*

Yoshio Kometani, Representative Director, Executive Vice President and Chief Digital Information Officer of Mitsui & Co., Ltd. stated: "We are very pleased with the strategic partnership between Mitsui and Quantinuum. By combining Quantinuum's cutting-edge quantum computing expertise and diverse quantum talents with Mitsui's broad business platform and network, we will work together to provide new value to our customers and create new business value in a wide range of industrial fields."

Ilyas Khan, Founder and CEO of Quantinuum stated: "The alliance between Mitsui and Quantinuum demonstrates our shared commitment to accelerating quantum computing across all applications and use cases in a diverse range of sectors, including chemistry, finance, and cybersecurity. Today's announcement reinforces our belief in the global quantum leadership shown by corporations and governments in Japan, pioneered by corporate leaders like Mitsui."

Details of the Strategic Partnership

Collaboration areas and applications

Recent Achievements by Quantinuum

About Mitsui & Co., Ltd.

Location: 1-2-1 Otemachi, Chiyoda-ku, Tokyo

Established: 1947

Representative: Kenichi Hori, President and Representative Director

Mitsui & Co., Ltd. (8031: JP) is a global trading and investment company with a diversified business portfolio that spans approximately 63 countries in Asia, Europe, North, Central & South America, the Middle East, Africa and Oceania.

Mitsui has about 5,500 employees and deploys talent around the globe to identify, develop, and grow businesses in collaboration with a global network of trusted partners. Mitsui has built a strong and diverse core business portfolio covering the Mineral and Metal Resources, Energy, Machinery and Infrastructure, and Chemicals industries.

Leveraging its strengths, Mitsui has further diversified beyond its core profit pillars to create multifaceted value in new areas, including innovative Energy Solutions, Healthcare & Nutrition and through a strategic focus on high-growth Asian markets. This strategy aims to derive growth opportunities by harnessing some of the world's main mega-trends: sustainability, health & wellness, digitalization and the growing power of the consumer.

Mitsui has a long heritage in Asia, where it has established a diverse and strategic portfolio of businesses and partners that gives it a strong differentiating edge, provides exceptional access for all global partners to the world's fastest growing region and strengthens its international portfolio.

For more information on Mitsui & Co.'s businesses, visit https://www.mitsui.com/jp/en/index.html

About Quantinuum

Location: Cambridge, U.K., Broomfield, Colorado, U.S.A.

Established: December 2021 (through the merger of Honeywell Quantum Solutions (U.S.) and Cambridge Quantum Computing (U.K.))

Representative: Ilyas Khan, CEO; Tony Uttley, COO; Shuya Kekke, CEO & Representative Director, Japan

Quantinuum is one of the world's largest integrated quantum computing companies, formed by the combination of Honeywell Quantum Solutions' world-leading hardware and Cambridge Quantum's class-leading middleware and applications. Science-led and enterprise-driven, Quantinuum accelerates quantum computing and the development of applications across chemistry, cybersecurity, finance, and optimization. Its focus is to create scalable and commercial quantum solutions to solve the world's most pressing problems in fields such as energy, logistics, climate change, and health. The company employs over 480 individuals, including 350 scientists, at nine sites across the United States, Europe, and Japan.

Selected major customers (in Japan): Nippon Steel Corporation, JSR Corporation

http://www.quantinuum.com

Photo - https://mma.prnewswire.com/media/1923231/Quantinuum.jpg

Photo - https://mma.prnewswire.com/media/1923232/Quantinuum_System_Model.jpg

SOURCE Quantinuum LLC

Quantum Leap: "The big bang of quantum computing will come in this decade" – CTech

In the few images that IBM has released, its quantum computing lab looks like the engine room of a spaceship: bright white rooms with countless cables dangling from the ceiling down to a floating floor, pierced with vents. This technological tangle is just the background for the main show: rows of metal supports on which hang what look like... white solar boilers.

There, within these boilers, a historical revolution is taking shape. IBM, a computing dinosaur more than a century old, is trying to reinvent itself by winning one of the most grueling, expensive and potentially promising scientific races ever: the race to develop the quantum computer. "We are living in the most exciting era in the history of computing," says Dario Gil, Senior Vice President of IBM and head of the company's research division, in an exclusive interview with Calcalist. "We are witnessing a moment similar to the one recorded in the 1940s and 1950s, when the first classical computers were built." A few weeks after this conversation, his statements were further confirmed when the Nobel Prize Committee announced the awarding of the physics prize to three researchers whose work served as a milestone in the development of the field.

The name Dario Gil stirs plenty of quanta and brain cells, and maybe even hearts, among physicists and computer engineers all over the world. This is the person who leads the world's most advanced effort to develop a quantum computer. In September, when Gil landed in Tel Aviv for a short visit to give the opening lecture at the IBM conference, the hall was packed with senior engineers, researchers from the top universities in Israel, and representatives of government bodies - all enthralled by what Gil had to say.

Dario Gil.

(Photo: Elad Gershgoren)

Gil (46) was born in Spain and moved to the United States to study at MIT. He completed his doctoral studies there, and immediately after graduation began working at IBM in a series of research and development positions. Since 2019, he has been leading the company's research division, which has 3,000 engineers at 21 sites, including Israel. Under his management, in 2016, IBM built the first quantum computer whose services are available to anyone: if you have a complicated question, you can go to the IBM Quantum Experience website, remotely access one of the quantum computers through the cloud - and, perhaps, receive an answer. But as with everything related to quantum computing, it only sounds simple.

"Quantum computing is not just a name for an extremely fast computer," says Gill. In fact, he explains, the quantum computer is no longer a supercomputer that uses the same binary method that is accepted in every classical computer, but a completely new machine, another step in the evolution leading from strings of shells, through beaded invoices and calculating bars, to gear-based mechanical computers, to the electronic computer and now to the quantum computer. "Essentially, the quantum computer is a kind of simulator of nature, through which it is possible to simulate natural processes, and thus solve problems that previously had no solution," explains Gil. "If the classical computer is a combination of mathematics and information, then quantum computing is a combination of physics and information."

This connection makes it possible to solve certain types of problems with unprecedented speed: Google, which is also developing a quantum computer, claimed in 2019 that it had reached "quantum supremacy": a demonstration of a calculation that a quantum computer performs more efficiently than a classical computer. The researchers at Google showed how a quantum computer performed in 200 seconds a calculation that, they claimed, would have required a classical computer ten thousand years to complete. This claim has since been disputed by other researchers, who presented an algorithm that allows a classical computer to perform the same calculation in a reasonable amount of time, but even this Google setback gives an idea of the enormous power a quantum computer will have.

"The quantum computer does not make the classical computer superfluous: they will live together, and each of them will solve different problems," explains Gil. "It's like asking you how to get from point A to point B: you can walk, ride a bicycle, travel by car or fly. If the distance between these points is 50 km, you won't fly between them, right? Accordingly, it is a mode suitable for a classic computer. A quantum computer allows you to fly, even to the moon, and quickly."

You will soon explain to me how it works, and in which areas exactly, but before that, let's start from the bottom line: what can we do with it?

"Quantum computing will make it possible to crack a series of problems that seemed unsolvable, in a way that will change the world. Many of these issues are related to energy. Others are related to the development of new and exciting materials. We tend to take the materials available to us for granted, but in the past there were eras that were defined by the materials that dominated them - The Stone Age', the 'Bronze Age', the 'Iron Age'. Quantum computing will help us develop materials with new properties, therefore the first sector that is already using it is industry, especially the car industry: the car manufacturers are interested in better chemistry, which will enable the production of more efficient and durable batteries for electric vehicles. For a normal computer this is a huge task, and to complete it we have to give up accuracy and settle for approximate answers only, but quantum computing can help quickly develop materials that will fit the task, even without entering the lab. The efficiency of a quantum computer when it comes to questions in chemistry is also used in the pharmaceutical industry, There they are beginning to make initial use of such computers to examine the properties of molecules, and in this way to speed up the development of new drugs; and also in the fertilizer industry, which will be able to develop substances whose production will not harm the environment.

The uses are not limited to the material world. "For the financial sector, for example, the quantum computer enables the analysis of scenarios, risk management and forecasting, and the industry is already very interested in such possible applications, which could provide the general public with dramatically improved performance in investment portfolios, for example."

IBM.

(Photo: Shutterstock)

At the same time, there are industries that quantum computing will force to recalculate their course, and the information security industry is at the forefront. Modern encryption systems (mainly RSA, one of whose developers is the Israeli Prof. Adi Shamir) are asymmetric: each recipient publishes a code (a "public key") that allows information sent to them to be encrypted; the public key includes the product of two large prime numbers, while the primes themselves are kept secret. To decipher the encrypted information, this product must be broken down into its factors - but without knowing what the original primes are, "this task would require a normal computer to calculate for many years," explains Gil. "However, for the quantum computer, such a calculation can be a matter of seconds."
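
To make the factoring threat concrete, here is a toy sketch in Python with deliberately tiny, insecure numbers chosen for illustration (real RSA keys use primes hundreds of digits long): anyone who can factor the published modulus can rebuild the private key and read the message, which is exactly the step Shor's algorithm would speed up on a large quantum computer.

```python
# Toy RSA, for illustration only -- the numbers are far too small to be secure.
from math import gcd

p, q = 61, 53                  # the two secret primes
n = p * q                      # 3233: published as part of the public key
phi = (p - 1) * (q - 1)        # kept secret
e = 17                         # public exponent, coprime with phi
assert gcd(e, phi) == 1
d = pow(e, -1, phi)            # private exponent, derivable only from the primes

message = 65
ciphertext = pow(message, e, n)        # anyone can encrypt with the public key (e, n)
assert pow(ciphertext, d, n) == message

# An attacker who factors n can reconstruct d and decrypt:
factor = next(c for c in range(2, n) if n % c == 0)    # trivial for 3233, infeasible
other = n // factor                                     # classically for 2048-bit keys,
d_attacker = pow(e, -1, (factor - 1) * (other - 1))     # but within reach of a large
assert pow(ciphertext, d_attacker, n) == message        # quantum computer running Shor
```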

There is a real threat here to an entire industry, the logic behind which has been built since the 1970s, and now suddenly the ground is cracking under it.

"True, a normal computer needs ten thousand years to solve an encryption that a quantum computer would solve in an instant. That is why the quantum computer threatens the world of cyberspace and encryption, which are the basis of all global information security. This is an example that is not related to physics or nature, but simply to the stronger and faster computing power of the quantum computer.

The computer that works against all the rules of intuition

To understand the power of the quantum computer, this concept, "quantum computing", must first be broken down. The first step is to stop thinking in the familiar concepts of one and zero. Forget about bits and binaries. The key to understanding quantum computing is the recognition that this dichotomy does not apply: instead of the bit, quantum computing relies on a basic unit of information called a qubit (short for "quantum bit"). The qubit is simultaneously one, zero and everything in between.

This is the moment to stop and explain the theory that underlies the quantum computer, and which seems to go against common sense. "Quantum theory makes it possible to explain the behavior of very, very small particles," Gil explains. "At school we are presented with a model of an atom that looks like a planet, with a nucleus and electrons moving around, but at the beginning of the 20th century, this model turned out to be not very accurate." This happened when physicists such as Max Planck and Albert Einstein realized that light, which until then physics saw as a wave, also behaves as a particle - and the energy of this particle can only be described in "quantum" jumps, that is, as discrete packets. In the decades that followed, this theory was developed more and more, and proved to be effective in describing a variety of phenomena in the world of particles. And yet, its deep meanings remain obscure even today.

Such is, for example, the idea that a particle is in more than one place. According to quantum theory, a particle moving between two points moves simultaneously in all the paths between them, a state called "superposition". It's not that we don't know its exact location: it just doesn't have one. Instead, it has a distribution of possible locations that coexist. In other words, reality is not certain, but probabilistic.

And this is not the only puzzle posed by quantum theory. Another confusing concept is "entanglement", a situation in which several particles exhibit identical physical values, and respond simultaneously to a change in one of them, even if they are at a great distance from each other. Gil suggests thinking of it as tossing two coins: anyone who has studied statistics knows that the probabilities of getting "heads" or "tails" on each of them are independent. But in the quantum model, if the coins (representing particles here) are entangled, then tossing one of them will yield the same result as the other. "Einstein didn't believe in entanglement, and hated these ideas," Gil says with a smile.
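
Gil's coin analogy can be reproduced in a few lines of ordinary Python. This is a minimal sketch using plain NumPy rather than any quantum hardware or SDK: sampling measurement outcomes from a Bell state (the simplest entangled state of two qubits) only ever produces "00" or "11", so the two simulated "coins" always agree.

```python
# Minimal NumPy sketch of measuring a two-qubit Bell state -- illustration only.
import numpy as np

rng = np.random.default_rng(0)

# Bell state (|00> + |11>) / sqrt(2): amplitudes for outcomes 00, 01, 10, 11
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
probabilities = np.abs(bell) ** 2          # Born rule: probability = |amplitude|^2

outcomes = ["00", "01", "10", "11"]
samples = rng.choice(outcomes, size=10, p=probabilities)
print(samples)   # only "00" and "11" appear: the two "coins" always match
```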

Measurements that affect the results? A reality that is not absolute but statistical? Particles that become twins even at infinite distance? If these ideas sound puzzling, incomprehensible or counter-intuitive to you, you are not alone: "Whoever comes across quantum theory and is not left stunned, has not understood it," said the physicist Niels Bohr, Einstein's contemporary and his great nemesis, who won the Nobel Prize for his contribution to the development of the theory (Einstein, by the way, had reservations about Bohr's interpretation of the theory's conclusions). Another physicist who won the Nobel Prize for his contribution to the theory, Richard Feynman, commented on this when he said: "If you think you have understood quantum theory, you have not."

The same Feynman is the father of quantum computing: he wanted to simulate the behavior of particles, but due to the probabilistic nature of the theory, a classical computer attempting such a simulation would require an enormous amount of calculation, making the simulation impractical. "Feynman, and like him other physicists, thought that the field of computing had focused on mathematical horizons and moved too far away from nature, and that physics could be more connected to the world of information," explains Gil. "In a historic lecture he gave in 1981, Feynman argued that there was no point in asking a classical computer to simulate particles, because nature is not classical. He said, 'If we want to simulate nature, we need a machine that behaves like nature, in a quantum way.'" In 1998, this vision was realized, when the first quantum computer was built at the University of Oxford in Great Britain.

A quantum computer utilizes the enigmatic properties of quantum theory, those that are not fully understood by us, to perform calculations. In a normal computer, the basic unit of information is a "bit", which can have one of two values, 0 or 1; using such bits makes it possible to perform any calculation imaginable, although some of these calculations may take a very long time. In a quantum computer, the qubit, thanks to superposition, represents not one absolute value but a distribution of values. "You can think of it as a question of more dimensions: one and zero are just the ends, the poles of a coin for example, but it can also have a sideways tilt," explains Gil. Using statistical approaches it is possible to examine the state of the qubit and obtain useful results. This probabilistic approach is not suitable for every problem, but for certain problems it is vastly more efficient than the classical computer's search for an absolute answer.
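
The "sideways tilt" Gil describes can be written down directly: a qubit's state is just a pair of amplitudes, and measuring it returns 0 or 1 with probabilities given by the squared amplitudes. The following is an illustrative sketch in plain NumPy (not a simulation of any real hardware).

```python
# A single qubit "tilted" between 0 and 1 -- illustrative NumPy sketch.
import numpy as np

theta = np.pi / 3                              # the amount of "tilt"
qubit = np.array([np.cos(theta / 2),           # amplitude of |0>
                  np.sin(theta / 2)])          # amplitude of |1>

p0, p1 = np.abs(qubit) ** 2                    # Born rule: squared amplitudes
print(f"P(0) = {p0:.2f}, P(1) = {p1:.2f}")     # here 0.75 and 0.25

# Repeated measurements return a distribution, not a single fixed answer:
samples = np.random.default_rng(1).choice([0, 1], size=10_000, p=[p0, p1])
print(samples.mean())                          # close to 0.25
```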

"Because of the entanglement effect, it is also possible to cause the qubits to influence each other," says Gil. And since each qubit represents an entire field of possibilities, each addition of a qubit increases the number of possible connections between the qubits with exponentially increasing power (in the classical computer, on the other hand, the addition of bits grows linearly). At the moment, IBM holds the record for qubits: last year it unveiled a quantum processor with 127 qubits, and its stated goal is to launch a processor with 433 qubits this year, and a processor with 1,021 qubits next year.

Three degrees colder than outer space

This ambition is more challenging than it sounds. It turns out that "building a machine that will behave like nature" is a complex story like no other: the qubits are very sensitive to outside influences, which makes building a computer a very complicated and expensive business. "The quantum computer is very powerful, but at the same time also very delicate," explains Gil: "It utilizes physical processes that occur in the world, but such processes form a system in which everything is connected, everything affects everything, and this can disrupt the results: if energy from the outside world gets in and couples to the qubits, it will make them behave like normal bits, and thus the unique ability of quantum computation will be lost. Therefore, a quantum computer must be very isolated from its entire environment. The big challenge is to produce a system that is sufficiently isolated from the outside world, but not too isolated."

When I try to find out what the cost of building a quantum computer is - and IBM has already built 40 of them - Gil avoids a clear answer, but it is enough to hear what this effort entails: "There are several different approaches to building a quantum computer; IBM chose a cryogenic approach, meaning deep freezing, and the use of superconductors. The temperature in the computer is close to absolute zero: at the bottom of its case the temperature is minus 273 degrees Celsius, three degrees colder than outer space and less than one degree above absolute zero. The temperature should be close to absolute zero, but not reach it, because then there is no movement at all, not even of the atoms."

The result is a cooling and protective enclosure that resembles a water heater in its shape; inside it sits the computation unit, whose shape earned it the nickname "chandelier" from Gil and his team. "Inside the layers of protection there is a cylinder with the processor in it. Even if only a fraction of an energy particle enters the computer, literally a fraction of nothing, it will be enough to disrupt the results," Gil clarifies.

The great sensitivity, and the protection requirements derived from it, mean that the quantum computer is quite cumbersome: in the newest models, which try to include more and more qubits, the case already reaches a height of several meters. To some extent it is reminiscent of the first generations of classic computers, which looked like huge cabinets. Those classic computers kept getting smaller and smaller, until today we squeeze millions of times more computing power into a simple smartphone, but in the case of quantum computers, we cannot expect a similar process: "The quantum computer requires unique conditions that cannot be produced in a simple terminal device, and this will not change in the foreseeable future," Gil explains. "I believe that quantum computing will be a service that we can access remotely, as we access cloud services today. It will work similarly to what IBM already enables today: the computer sits with us, and we make it possible to access the 'brain' and receive answers. Of the 40 computers we have built since 2016, 20 are available to the public today. About half a million users all over the world have already made use of the capabilities of the quantum computer we built, and based on this use, about a thousand scientific publications have already been published."

Google and Microsoft are heating up the competition

IBM is not the only company participating in the quantum computing race, but Gil exudes full confidence in its ability to lead it: according to him, most competitors only have parts of the overall system, but not a complete computer available to solve problems. Google, as mentioned, is a strong contender in this race, and it also allows remote access to its quantum computing service, Google Quantum AI; Microsoft is also working to provide a similar service on its cloud platform, Azure.

For now, quantum computing is a promise "on paper". The theoretical foundations for this revolution were laid 40 years ago, the first proofs were presented more than 20 years ago, and the industry has been buzzing around this field for several years - yet we still haven't seen uses that would serve a regular person.

"If you go back to the 1940s, when the first computers were invented, you will see that even then the uses and advantages of the new invention were not clear. Those who saw the first computers said, 'Oh, great, you can use it to crack the code of encryption machines in wars, maybe even calculate routes of ballistic missiles, and that's it. Who's going to use it? Nobody,'" Gil laughs. "In the same way, the success of quantum computing will depend on its uses: how easy it will be to program, how large the community of users will be, what talents will get there. The quantum revolution will be led by a community, which is why education for this field is so important: we need more and more smart people to start to think 'how can I use quantum computing to advance my field'.

"What is beginning these days is the democratization phase of quantum computing, which will allow anyone to communicate with the computer without being an advanced programmer in the field: it will be possible to approach it with a question or a task that will be written in the classical languages of one or zero. That is why we are already seeing more use of quantum computing capacity today.

"There are also many startups that do not actually work to establish a quantum computer, but focus on various components of this world (for example, the Israeli company Quantum Machines, which develops hardware and software systems for quantum computers, and last July was selected by the Innovation Authority to establish the Israeli Quantum Computing Center). The activity of such companies creates a completely new ecosystem, thus promoting the industry and accelerating its development, just as is happening today in the field of ordinary computers. IBM will not rely only on itself either: we would like to benefit from the innovation of smart people in this field, of course also in Israel.

"I am convinced that the big bang of quantum computing will happen in this decade. Our ambition at IBM is to demonstrate 'quantum supremacy' already in the next three years. I believe that the combination of advances in artificial intelligence, together with quantum computing, will bring about a revolution in the industry of the kind that Nvidia made in its market (Nvidia developed unique processors for gaming computers, which made it the chip company that reached a billion dollar revenue the fastest.) Quantum computing can generate enormous value in the industry. It is phenomenally difficult, but it is clear to me that we will see the uses already in the current decade."

The Nobel Prize opens a new horizon for quantum computing

Quantum computing has ignited the imagination of researchers for many decades, but until now it has not left the confines of laboratories. However, the awarding of the Nobel Prize to three researchers in the field indicates that the vision is becoming a real revolution. Alain Aspect of France, the American John Clauser and the Austrian Anton Zeilinger received the award for research they conducted (separately) from the 1970s onward, in which they examined the phenomenon of quantum entanglement (described in the article), proved its existence and laid the groundwork for its technological use.

The awarding of the Nobel Prize to the entanglement researchers proves that quantum computing is more than a mental exercise for a small circle of physicists, and it is a defining moment for companies investing capital in the development of the field. They are pushed to this effort by a fundamental change in the world in which they operate: in recent decades, the world of computing has operated according to "Moore's Law", which foresees that the density of transistors in computer processors will double every two years, increasing the computing power of these chips. However, as the industry approaches the physical limit beyond which it will be impossible to cram more transistors onto a chip, the need to develop a quantum computer has become acute.

The numbers also signal that something is happening in the field. In 2020, the scope of the quantum computing market was less than half a billion dollars, but at the end of 2021, in a signal that the vision is beginning to be realized, the research company IDC published an estimate according to which in 2027 the scope of the market will reach $8.6 billion and investments in the field will amount to $16 billion (compared to $700 million in 2020 and $1.4 billion in 2021). IBM CEO Arvind Krishna also recently estimated that in 2027 quantum computing will become a real commercial industry.

VW teams with Canadian quantum computing company Xanadu on batteries – Automotive News Canada

Quantum computing, Ardey added in a release, might trigger a revolution in material science that will feed into the company's in-house battery expertise.

Leaving the bits and bytes of classical computing behind, quantum computers rely on qubits, and are widely seen as having potential to solve complex problems that traditional computers could not work through on reasonable timelines.

The automaker and Toronto-based technology firm have already been collaborating on research into material science, computational chemistry, and quantum algorithms for about a year. That early work set the foundation for the formal partnership, Volkswagen said.

The goal of the research is to develop quantum algorithms that can simulate, more quickly than traditional computer models, how a blend of battery materials will interact. Computational chemistry, which is traditionally used for such work, Ardey said, is reaching its limitations when it comes to battery research.

Juan Miguel Arrazola, head of algorithms at Xanadu, said the partnership is part of the Canadian company's drive to make quantum computers truly useful.

Focusing on batteries is a strategic choice given the demand from industry and the prospects for quantum computing to aid in understanding the complex chemistry inside a battery cell.

Using the quantum algorithms, Volkswagen said it aims to develop battery materials that are safer, lighter and cheaper.

Cleveland Clinic and IBM Begin Installation of IBM Quantum System One – Cleveland Clinic Newsroom

Cleveland Clinic and IBM have begun deployment of the first private-sector, onsite, IBM-managed quantum computer in the United States. The IBM Quantum System One is to be located on Cleveland Clinic's main campus in Cleveland.

The first quantum computer in healthcare, anticipated to be completed in early 2023, is a key part of the two organizations' 10-year partnership aimed at fundamentally advancing the pace of biomedical research through high-performance computing. Announced in 2021, the Cleveland Clinic-IBM Discovery Accelerator is a joint center that combines Cleveland Clinic's medical expertise with the technology expertise of IBM, including its leadership in quantum computing.

"The current pace of scientific discovery is unacceptably slow, while our research needs are growing exponentially," said Lara Jehi, M.D., Cleveland Clinic's Chief Research Information Officer. "We cannot afford to continue to spend a decade or more going from a research idea in a lab to therapies on the market. Quantum offers a future to transform this pace, particularly in drug discovery and machine learning."

"A step change in the way we solve scientific problems is on the horizon," said Ruoyi Zhou, Ph.D., Director of the IBM Research-Cleveland Clinic Partnership. "At IBM, we're more motivated than ever to create, with Cleveland Clinic and others, lasting communities of discovery and harness the power of quantum computing, AI and hybrid cloud to usher in a new era of accelerated discovery in healthcare and life sciences."

The Discovery Accelerator at Cleveland Clinic draws upon a variety of IBM's latest advancements in high-performance computing, including:

Lara Jehi, M.D., and Ruoyi Zhou, Ph.D., at the site of the IBM Quantum System One on Cleveland Clinic's main campus. (Courtesy: Cleveland Clinic/IBM)

The Discovery Accelerator also serves as the technology foundation for Cleveland Clinic's Global Center for Pathogen Research & Human Health, part of the Cleveland Innovation District. The center, supported by a $500 million investment from the State of Ohio, JobsOhio and Cleveland Clinic, brings together a team focused on studying, preparing for and protecting against emerging pathogens and virus-related diseases. Through the Discovery Accelerator, researchers are leveraging advanced computational technology to expedite critical research into treatments and vaccines.

Together, the teams have already begun several collaborative projects that benefit from the new computational power. The Discovery Accelerator projects include a research study developing a quantum computing method to screen and optimize drugs targeted to specific proteins; improving a prediction model for cardiovascular risk following non-cardiac surgery; and using artificial intelligence to search genome sequencing findings and large drug-target databases to find effective, existing drugs that could help patients with Alzheimer's and other diseases.

A significant part of the collaboration is a focus on educating the workforce of the future and creating jobs to grow the economy. An innovative educational curriculum has been designed for participants from high school to professional level, offering training and certification programs in data science, machine learning and quantum computing to build the skilled workforce needed for cutting-edge computational research of the future.

CEO Jack Hidary on SandboxAQ’s Ambitions and Near-term Milestones – HPCwire

Spun out from Google last March, SandboxAQ is a fascinating, well-funded start-up targeting the intersection of AI and quantum technology. "As the world enters the third quantum revolution, AI + Quantum software will address significant business and scientific challenges," is the company's broad, self-described mission. Part software company, part investor, SandboxAQ foresees a blended classical computing-quantum computing landscape with AI infused throughout.

Its developing product portfolio comprises enterprise software for assessing and managing cryptography/data security in the so-called post-quantum era. NIST, of course, released its first official post-quantum algorithm selections in July, and SandboxAQ is one of 12 companies selected to participate in its new project, Migration to Post-Quantum Cryptography, to build and commercialize tools. SandboxAQ's AQ Analyzer product, says the company, is already available and being used by a few marquee customers.

Then there's SandboxAQ's Strategic Investment Program, announced in August, which acquires or invests in technology companies of interest. So far, it has acquired one company (Cryptosense) and invested in two others (evolutionQ and Qunnect).

Last week, HPCwire talked with SandboxAQ CEO Jack Hidary about the company's products and strategy. One has the sense that SandboxAQ's aspirations are broad, and with nine-figure funding, it has the wherewithal to pivot or expand. The A in the name stands for AI and the Q stands for quantum. One area not on the current agenda: building a quantum computer.

"We want to sit above that layer. All these [qubit] technologies, ion trap, NV center (nitrogen vacancy center), neutral atoms, superconducting, photonic, are very interesting, and we encourage and mentor a lot of these companies who are quantum computing hardware companies. But we are not going to be building one because we really see our value as a layer on top of those computing [blocks]," said Hidary. Google, of course, has another group working on quantum hardware.

Hidary joined Google in 2016 as Sandbox group director. A self-described serial entrepreneur, Hidary's varied experience includes founding EarthWeb, being a trustee of the XPrize Foundation, and running for mayor of New York City in 2013. While at Google Sandbox, he wrote a textbook, Quantum Computing: An Applied Approach.

"I was recruited to start a new division to focus on the use of AI, and ultimately also quantum, in solving really hard problems in the world. We realized that we needed to be multi-platform and focus on all the clouds and to do [other] kinds of stuff, so we ended up spinning out earlier this year," said Hidary.

"Eric Schmidt joined us about three and a half years ago as he wrapped up his chairmanship at Alphabet (Google's parent company). He got really into what we're doing, looking at the impact that scaled computation can have both on the AI side and the quantum side. He became chairman of SandboxAQ. I became CEO. We've got other backers like Marc Benioff from Salesforce and T. Rowe Price and Guggenheim, who are very long-term investors. What you'll notice here that's interesting is we don't have short-term VCs. We have really long-term investors who are here for 10 to 15 years."

The immediate focus is on post-quantum cryptography tools delivered mostly via a SaaS model. By now we're all familiar with the threat that fault-tolerant quantum computers will be able to crack conventionally encrypted (RSA) data using Shor's algorithm. While fault-tolerant quantum computers are still many years away, the National Institute of Standards and Technology (NIST) and others, including SandboxAQ, have warned against Store Now/Decrypt Later attacks. (See the HPCwire article, "The Race to Ensure Post Quantum Data Security.")

"What adversaries are doing now is siphoning off information over VPNs. They're not cracking into your network. They're just doing it over VPNs, siphoning that information. They can't read it today, because it's RSA-protected, but they'll store it and read it in a number of years when they can," he said. "The good news is you don't have to scrap your hardware. You could just upgrade the software. But that's still a monumental challenge. As you can imagine, for all the datacenters and the high-performance computing centers this is a non-trivial operation to do all that."

A big part of the problem is simply finding where encryption code lives in existing infrastructure. That, in turn, has prompted calls for what is being called crypto-agility: a comprehensive yet modular approach that allows cryptography code to be swapped in and out easily.
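
As a rough illustration of the crypto-agility idea (a generic design sketch, not SandboxAQ's product or API; the algorithm names and interfaces below are hypothetical), applications can request cryptographic primitives from a registry by configured name, so replacing RSA with a post-quantum scheme becomes a configuration change rather than a code rewrite:

```python
# Hypothetical sketch of a crypto-agile abstraction layer -- illustrative only.
from typing import Callable, Dict

# A provider returns (public_key, private_key); a real system would also expose
# encapsulate/decapsulate or sign/verify operations behind the same registry.
KeyGenerator = Callable[[], tuple[bytes, bytes]]

REGISTRY: Dict[str, KeyGenerator] = {}     # populated by pluggable providers


def register(name: str, keygen: KeyGenerator) -> None:
    """Register an algorithm implementation under a stable name."""
    REGISTRY[name] = keygen


def generate_keypair(algorithm: str) -> tuple[bytes, bytes]:
    """Application code asks for an algorithm by name; it never hard-codes one."""
    return REGISTRY[algorithm]()


# Migrating from, say, "rsa-2048" to a post-quantum scheme such as "ml-kem-768"
# (placeholder names) then means changing one configuration value, provided an
# implementation for the new name has been registered.
CONFIGURED_ALGORITHM = "ml-kem-768"
```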

"We want crypto-agility, and what we find is large corporations, large organizations, and large governments don't have crypto-agility. What we're hoping is to develop tools to implement this idea. For example, as a first step to crypto-agility, we're trying to see if people even have an MRI (discovery metaphor) machine for use on their own cybersecurity, and they really don't when it comes to encryption. There are no diagnostic tools that these companies are using to find where their [encryption] footprint is or if they are encrypting everything appropriately. Maybe some stuff is not even being encrypted," said Hidary, who favors the MRI metaphor for a discovery tool.

No doubt, the need to modernize encryption/decryption methods and tools represents a huge problem and a huge market.

Without getting into technical details, Hidary said SandboxAQ is leveraging technology from its recent Cryptosense acquisition and internally developed technologies to develop a product portfolio planned to broadly encompass cryptography assessment, deployment and management. Its core current product is AQ Analyzer.

The idea, says Hidary, returning to the MRI metaphor, is to take an MRI scan of the inside of the organization, on-premise, cloud, private cloud, and so forth, and feed that into compliance, vulnerability and post-quantum analysis. "It's not just a quantum thing. It's about your general vulnerabilities on encryption. Overall, it happens to be that post-quantum is helped by this, but this is a bigger issue. Then that feeds into your general sysops, network ops, and management tools that you're using."

AQ Analyzer, he says, is enterprise software that starts the process of becoming crypto-agile for organizations. It's now being used at large banks and telcos, and also by Mount Sinai Hospital. Healthcare, replete with sensitive information, is another early target for SandboxAQ. Long term, the idea is for SandboxAQ's software tools to automate much of the crypto management process, from assessment to deployment through ongoing monitoring and management.

"That's the whole crypto-agility ballgame," says Hidary.

The business model, says Hidary, is a carbon copy of Salesforce.com's SaaS model. Broadly, SandboxAQ uses a three-pronged go-to-market approach: direct sales; global systems integrators (in May it began programs with Ernst & Young (EY) and Deloitte); and strategic partners/resellers. Vodafone and SoftBank are among the latter. Even though these are still early days for SandboxAQ as an independent entity, it's moving fast, having benefitted from years of development inside Google. AQ Analyzer, said Hidary, is in general availability.

"We're doing extremely well in banks and financial institutions. They're typically early adopters of cybersecurity because of the regulatory and compliance environment, and the trust they have with their customers," said Hidary.

Looking at near-term milestones, he said, "We'd like to see a more global footprint of banks. We'll be back in Europe soon now that we have Cryptosense (UK- and Paris-based), and we have a strong local team in Europe. We've had a lot of traction in the U.S. and the Canadian markets. So that's one key milestone over the next 18 months or so. Second, we'd like to see [more adoption] into healthcare and telcos. We have Vodafone and SoftBank Mobile on the telco side. We have Mount Sinai; we'd like to see if that can be extended into additional players in those two spaces. The fourth vertical we'll probably go into is the energy grid. These are all critical infrastructure pieces of our society: the financial structure of our society, energy, healthcare and the medical centers, the telecommunications grid."

While SandboxAQ's AQ Analyzer is the company's first offering, it's worth noting that the company is aggressively looking for niches it can serve. For example, the company is keeping close tabs on efforts to build a quantum internet.

"There's going to be a parallel quantum coherent internet to connect for distributed quantum computing," said Hidary. "So nothing to do with cyber at all."

"Our vision of the future, that we share with I think everyone in the industry, is that quantum does not take over classical," said Hidary. "It's a mesh, a hybridization of CPU, GPU and quantum processing units. And the program, the code, in Python for example: part of it runs on CPUs, part of it on GPUs, and then yes, part of it will run on a QPU. In that mesh, you'd want to have access both to the traditional internet, TCP/IP today, but you also want to be able to connect over a quantum coherent intranet. So that's Qunnect."

Qunnect, of course, is one of the companies SandboxAQ has invested in, and it is working on hardware (quantum memory and repeaters) to enable a quantum internet. Like dealing with post-quantum cryptography, outfitting the quantum internet is likely to be a huge business. Looking at SandboxAQ, just seven months after being spun out from Google, the scope of its ambitions is hard to pin down.

Stay tuned.

The world, and today's employees, need quantum computing more than ever – VentureBeat

Quantum computing will soon be able to address many of the world's toughest, most urgent problems.

That's why the semiconductor legislation Congress just passed is part of a $280 billion package that will, among other things, direct federal research dollars toward quantum computing.

Quantum computing will soon be able to:

The economy and the environment are clearly two top federal government agenda items. Congress in July was poised to pass the most ambitious climate bill in U.S. history. The New York Times said that the bill would pump hundreds of billions of dollars into low-carbon energy technologies like wind turbines, solar panels and electric vehicles, and would put the United States on track to slash its greenhouse gas emissions to roughly 40% below 2005 levels by 2030. This could help to further advance and accelerate the adoption of quantum computing.

Because quantum technology can solve many previously unsolvable problems, a long list of the world's leading businesses, including BMW and Volkswagen, FedEx, Mastercard and Wells Fargo, and Merck and Roche, is making significant quantum investments. These businesses understand that transformation via quantum computing, which is quickly advancing with breakthrough technologies, is coming soon. They want to be ready when that happens.

It's wise for businesses to invest in quantum computing because the risk is low and the payoff is going to be huge. As BCG notes: "No one can afford to sit on the sidelines as this transformative technology accelerates toward several critical milestones."

The reality is that quantum computing is coming, and it's likely not going to be a standalone technology. It will be tied to the rest of the IT infrastructure: supercomputers, CPUs and GPUs.

This is why companies like Hewlett Packard Enterprise are thinking about how to integrate quantum computing into the fabric of the IT infrastructure. It's also why Terra Quantum AG is building hybrid data centers that combine the power of quantum and classical computing.

Amid these changes, employees should start preparing now. There is going to be a tidal wave of need both for quantum Ph.D.s and for other talent, such as skilled quantum software developers, to contribute to quantum efforts.

Earning a doctorate in a field relevant to quantum computing requires a multi-year commitment. But obtaining valuable quantum computing skills doesn't require a developer to go back to college, take out a student loan or spend years studying.

With modern tools that abstract the complexity of quantum software and circuit creation, developers no longer require Ph.D.-level knowledge to contribute to the quantum revolution, enabling a more diverse workforce to help businesses achieve quantum advantage. Just look at the winners in the coding competition that my company staged. Some of these winners were recent high school graduates, and they delivered highly innovative solutions.

Leading the software stack, quantum algorithm design platforms allow developers to design sophisticated quantum circuits that could not be created otherwise. Rather than defining tedious low-level gate connections, this approach uses high-level functional models and automatically searches millions of circuit configurations to find an implementation that fits resource considerations, designer-supplied constraints and the target hardware platform. New tools like Nvidia's QODA also empower developers by making quantum programming similar to how classical programming is done.

Developers will want to familiarize themselves with quantum computing, which will be an integral arrow in their metaphorical quiver of engineering skills. People who add quantum skills to their classical programming and data center skills will position themselves to make more money and be more appealing to employers in the long term.

Many companies and countries are experimenting with and adopting quantum computing. They understand that quantum computing is evolving rapidly and is the way of the future.

Whether you are a business leader or a developer, it's important to understand that quantum computing is moving forward. The train is leaving the station; will you be on board?

Erik Garcell is technical marketing manager at Classiq.

Cancer to Be Treated as Easily as Common Cold When Humans Crack Quantum Computing – Business Wire

DUBAI, United Arab Emirates--(BUSINESS WIRE)--Breakthroughs in quantum computing will enable humans to cure diseases like cancer, Alzheimer's, and Parkinson's as easily as we treat the common cold.

That was one of the major insights to emerge from the Dubai Future Forum, with renowned theoretical physicist Dr. Michio Kaku telling the world's largest gathering of futurists that humanity should brace itself for major transformations in healthcare.

The forum concluded with a call for governments to institutionalize foresight and embed it within decision-making.

Speaking at the forum, held at the Museum of the Future in Dubai, UAE, Amy Webb, CEO of the Future Today Institute, criticized nations for being too preoccupied with the present and too focused on creating white papers, reports and policy recommendations instead of taking action.

"Nowism is a virus. Corporations and governments are infected," she said.

One panel session heard how humans could be ready to test life on the Moon in just 15 years and be ready for life on Mars in another decade. Sharing his predictions for the future, Dr. Kaku also said there is a very good chance humans will pick up a signal from another intelligent life form this century.

Dr. Jamie Metzl, Founder and Chair, OneShared.World, urged people to eat more lab-grown meat to combat global warming and food insecurity.

"If we are treating them like a means to an end of our nutrition, wouldn't it be better, instead of growing the animal, to grow the meat?" he said.

Among the 70 speakers participating in sessions were several UAE ministers. HE Mohammad Al Gergawi, UAE Minister of Cabinet Affairs, Vice Chairman of the Board of Trustees and Managing Director of the Dubai Future Foundation, said ministers around the world should think of themselves as designers of the future. "Our stakeholders are 7.98 billion people around the world," he noted.

Dubai's approach to foresight was lauded by delegates, including HE Omar Sultan Al Olama, UAE Minister of State for Artificial Intelligence, Digital Economy, and Remote Work Applications, who said: "What makes our city and nation successful is not natural resources, but a unique ability to embrace all ideas and individuals."

More than 30 sessions covered topics including immortality, AI sentience, climate change, terraforming, genome sequencing, legislation, and the energy transition.

*Source: AETOSWire

New laboratory to explore the quantum mysteries of nuclear materials – EurekAlert

Replete with tunneling particles, electron wells, charmed quarks and zombie cats, quantum mechanics takes everything Sir Isaac Newton taught about physics and throws it out the window.

Every day, researchers discover new details about the laws that govern the tiniest building blocks of the universe. These details not only increase scientific understanding of quantum physics, but they also hold the potential to unlock a host of technologies, from quantum computers to lasers to next-generation solar cells.

But there's one area that remains a mystery even in this most mysterious of sciences: the quantum mechanics of nuclear fuels.

Until now, most fundamental scientific research of quantum mechanics has focused on elements such as silicon because these materials are relatively inexpensive, easy to obtain and easy to work with.

Now, Idaho National Laboratory researchers are planning to explore the frontiers of quantum mechanics with a new synthesis laboratory that can work with radioactive elements such as uranium and thorium.

An announcement about the new laboratory appears online in the journal Nature Communications.

Uranium and thorium, which are part of a larger group of elements called actinides, are used as fuels in nuclear power reactors because they can undergo nuclear fission under certain conditions.

However, the unique properties of these elements, especially the arrangement of their electrons, also mean they could exhibit interesting quantum mechanical properties.

In particular, the behavior of particles in special, extremely thin materials made from actinides could increase our understanding of phenomena such as quantum wells and quantum tunneling (see sidebar).

To study these properties, a team of researchers has built a laboratory around molecular beam epitaxy (MBE), a process that creates ultra-thin layers of materials with a high degree of purity and control.

"The MBE technique itself is not new," said Krzysztof Gofryk, a scientist at INL. "It's widely used. What's new is that we're applying this method to actinide materials: uranium and thorium. Right now, this capability doesn't exist anywhere else in the world that we know of."

The INL team is conducting fundamental research, science for the sake of knowledge, but the practical applications of these materials could make for some important technological breakthroughs.

"At this point, we are not interested in building a new qubit [the basis of quantum computing], but we are thinking about which materials might be useful for that," Gofryk said. "Some of these materials could be potentially interesting for new memory banks and spin-based transistors, for instance."

Memory banks and transistors are both important components of computers.

To understand how researchers make these very thin materials, imagine an empty ball pit at a fast-food restaurant. Blue and red balls are thrown in the pit one at a time until they make a single layer on the floor. But that layer isn't a random assortment of balls. Instead, they arrange themselves into a pattern.

During the MBE process, the empty ball pit is a vacuum chamber, and the balls are highly pure elements, such as nitrogen and uranium, that are heated until individual atoms can escape into the chamber.

The floor of our imaginary ball pit is, in reality, a charged substrate that attracts the individual atoms. On the substrate, atoms order themselves to create a wafer of very thin material, in this case uranium nitride.

Back in the ball pit, we've created a layer of blue and red balls arranged in a pattern. Now we make another layer of green and orange balls on top of the first layer.

To study the quantum properties of these materials, Gofryk and his team will join two dissimilar wafers of material into a sandwich called a heterostructure. For instance, the thin layer of uranium nitride might be joined to a thin layer of another material such as gallium arsenide, a semiconductor. At the junction between the two different materials, interesting quantum mechanical properties can be observed.

"We can make sandwiches of these materials from a variety of elements," Gofryk said. "We have lots of flexibility. We are trying to think about the novel structures we can create, with maybe some predicted quantum properties."

"We want to look at electronic properties, structural properties, thermal properties and how electrons are transported through the layers," he continued. "What will happen if you lower the temperature and apply a magnetic field? Will it cause electrons to behave in a certain way?"

INL is one of the few places where researchers can work with uranium and thorium for this type of science. The amounts of radioactive material involved, and the consequent safety concerns, will be comparable to the radioactivity found in an everyday smoke alarm.

"INL is the perfect place for this research because we're interested in this kind of physics and chemistry," Gofryk said.

In the end, Gofryk hopes the laboratory will result in breakthroughs that help attract attention from potential collaborators as well as recruit new employees to the laboratory.

"These actinides have such special properties," he said. "We're hoping we can discover some new phenomena or new physics that hasn't been found before."

In 1900, German physicist Max Planck first described how light emitted from heated objects, such as the filament in a light bulb, behaved like particles.

Since then, numerous scientists including Albert Einstein and Niels Bohr have explored and expanded upon Plancks discovery to develop the field of physics known as quantum mechanics. In short, quantum mechanics describes the behavior of atoms and subatomic particles.

Quantum mechanics is different than regular physics, in part, because subatomic particles simultaneously have characteristics of both particles and waves, and their energy and movement occur in discrete amounts called quanta.

More than 120 years later, quantum mechanics plays a key role in numerous practical applications, especially lasers and transistors a key component of modern electronic devices. Quantum mechanics also promises to serve as the basis for the next generation of computers, known as quantum computers, which will be much more powerful at solving certain types of calculations.

Uranium, thorium and the other actinides have something in common that makes them interesting for quantum mechanics: the arrangement of their electrons.

Electrons do not orbit around the nucleus the way the earth orbits the sun. Rather, they zip around somewhat randomly. But we can define areas where there is a high probability of finding electrons. These clouds of probability are called orbitals.

For the smallest atoms, these orbitals are simple spheres surrounding the nucleus. However, as the atoms get larger and contain more electrons, orbitals begin to take on strange and complex shapes.

In very large atoms like uranium and thorium (92 and 90 electrons respectively), the outermost orbitals are a complex assortment of party balloon, jelly bean, dumbbell and hula hoop shapes. The electrons in these orbitals have high energies. While scientists can guess at their quantum properties, nobody knows for sure how they will behave in the real world.

Quantum tunneling is a key part of any number of phenomena, including nuclear fusion in stars, mutations in DNA and diodes in electronic devices.

To understand quantum tunneling, imagine a toddler rolling a ball at a mountain. In this analogy, the ball is a particle. The mountain is a barrier, most likely a semiconductor material. In classical physics, there's no chance the ball has enough energy to pass over the mountain.

But in the quantum realm, subatomic particles have properties of both particles and waves. The wave's peak represents the highest probability of finding the particle. Thanks to a quirk of quantum mechanics, while most of the wave bounces off the barrier, a small part of that wave travels through if the barrier is thin enough.

For a single particle, the small amplitude of this wave means there is a very small chance of the particle making it to the other side of the barrier.

However, when large numbers of waves are traveling at a barrier, the probability increases, and sometimes a particle makes it through. This is quantum tunneling.
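
This qualitative picture matches the standard textbook estimate. As a hedged sketch (the generic square-barrier result, not a calculation specific to the materials discussed here), the probability of a particle tunneling through a barrier of height V_0 and thickness L falls off exponentially:

T \approx e^{-2\kappa L}, \qquad \kappa = \frac{\sqrt{2m(V_0 - E)}}{\hbar}

For an electron facing a barrier 1 eV above its own energy, \kappa \approx 5 \times 10^{9}\ \mathrm{m^{-1}}, so a 1-nanometer barrier gives T \approx e^{-10} \approx 5 \times 10^{-5}: a few chances in a hundred thousand per attempt, which is why tunneling only matters when the barrier is very thin.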

Quantum wells are also important, especially for devices such as light emitting diodes (LEDs) and lasers.

As with quantum tunneling, building quantum wells requires alternating layers of very thin (10-nanometer) material, where one layer acts as a barrier.

While electrons normally travel in three dimensions, quantum wells trap electrons in two dimensions within a barrier that is, for practical purposes, impossible to overcome. These electrons exist at specific energies, say, the precise energies needed to generate specific wavelengths of light.
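
Those "specific energies" follow from the textbook particle-in-a-box model. As a rough sketch (an idealized infinite well; real devices also depend on the semiconductor's effective electron mass), an electron confined to a well of width L can only occupy the energy levels

E_n = \frac{n^2 \pi^2 \hbar^2}{2 m L^2}, \qquad n = 1, 2, 3, \ldots

For a free electron in a 10-nanometer well, E_1 \approx 3.8\ \mathrm{meV}, and the levels climb as 1/L^2 as the well is narrowed; tuning L is how engineers set the energy gaps, and hence the emitted wavelengths, in LED and laser quantum wells.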

About Idaho National Laboratory: Battelle Energy Alliance manages INL for the U.S. Department of Energy's Office of Nuclear Energy. INL is the nation's center for nuclear energy research and development, and also performs research in each of DOE's strategic goal areas: energy, national security, science and the environment. For more information, visit www.inl.gov.

View post:

New laboratory to explore the quantum mysteries of nuclear materials - EurekAlert

1836, the Slaveholder Republic’s Birthday – The Texas Observer

History, it's often said, is written by the victors. While that isn't always true, it's certainly borne out by many popular accounts of the Texas Revolution of 1835-36, which often tell a very black-and-white story of the virtuous Texans (the victors) fighting against the evil Mexicans. The San Jacinto Monument inscription, for instance, blames the rebellion on "unjust acts and despotic decrees" of unscrupulous rulers in Mexico. A pamphlet produced by the Republican Party-sponsored 1836 Project says that Anglo settlers fought to preserve "constitutional liberty and republican government."

University of Houston history professor Gerald Horne tells a very different story. In The Counter-Revolution of 1836: Texas Slavery & Jim Crow and the Roots of American Fascism, published earlier this year, Horne contends that the motivation behind the Anglo-American rebellion was anything but virtuous: to make Texas safe for slavery and white supremacy. For others (Blacks, the Indigenous peoples, and many Tejanos), the Anglo victory meant slavery, oppression, dispossession, and in many cases, death.

The Counter-Revolution of 1836 is a big, sprawling book (over 570 pages), as befits its scope: It takes the reader from the lead-up to the Texas rebellion, through independence, annexation, the Civil War, Reconstruction, and Jim Crow, to the early 1920s. It is scrupulously researched, drawing not only on other scholars but also on a wide range of sources from the times, including letters, speeches, newspaper articles, and diplomatic posts.

Given current right-wing efforts to expel discussions of systemic racism from Texas classrooms, Horne's book is an important contribution to the ongoing debate over our collective history.

Recently, Horne discussed the book and its implications with the Texas Observer via email.

As the title indicates, you contend that the Texas Revolution was in fact a counter-revolution. What does counter-revolution mean to you, and why do you think it's a more accurate designation?

The title suggests that the 1836 revolt was in response to abolitionism south of the border and thus was designed to stymie progress. A revolution, properly understood, should advance progress. [The] counter-revolution in 1836 assuredly was a step forward for many European settlers; not so much for Africans and the Indigenous.

This book continues the story you begin in your 2014 book on the American rebellion against England (1775-83). In that book, you similarly contend that the American Revolution was a counter-revolution. Why do you think so?

Similarly, 1776 was designed to stymie not only the prospect of abolitionism, but as well to sweep away what was signaled by the Royal Proclamation of 1762-3, which expressed London's displeasure at continuing to expend blood and treasure ousting Indigenous peoples for the benefit of real estate speculators, e.g., George Washington. Not coincidentally, nationals from the post-1776 republic [the United States] were essential to the success of the 1836 counter-revolution.

You refer to the pre-emancipation United States and the pre-annexation Republic of Texas as "slaveholder republics." Some readers may bristle at this label, especially those who believe, as anti-critical race theory Senate Bill 3 puts it, that slavery was "not central to the American founding" but was merely a "failur[e] to live up to the authentic founding principles of the United States." Why do you think the term slaveholder republic is a more accurate description?

"Slaveholding republic" is actually a term popularized by the late Stanford historian (and Pulitzer Prize winner) Don Fehrenbacher. It is an indicator of regression (an offshoot of counter-revolution) that this accurate descriptor is now deemed to be verboten. This ruse of suggesting that every blemish (or atrocity) is inconsistent with founding principles is akin to the thief and embezzler telling the judge when caught red-handed, "Your honor, this is not who I am." Contrary to the delusions of the delirious, slaveholding was not an accident post-1776: How else to explain the exponential increase in the number of enslaved leading up to the Civil War? How else to explain U.S. and Texian nationals coming to dominate the slave trade in Cuba, Brazil, etc.?

You write that 1836 was "a civil war over slavery" and that, like a precursor of Typhoid Mary, Texas "seemed to bring the virulent bacteria that was war to whatever jurisdiction it joined." Of course, Texas ultimately joined the United States. How did slave-owning Texas infect the United States?

Texas was a bulwark of the so-called Confederate States of America, which seceded from the U.S. in 1861 not least to preserve (and extend) enslavement of Africans in the first place. The detritus of Texas slaveholders became a bulwark of the Ku Klux Klan, which served to drown Reconstruction (the post-Civil War steps to deliver a measure of freedom to the formerly enslaved) in blood. This Texas detritus were stalwart backers in the 20th century of the disastrous escapades of McCarthyism, which routed not just communists but numerous labor organizers and anti-Jim Crow advocates. Texas also supplied a disproportionate percentage of the insurrectionists who stormed Capitol Hill on 6 January 2021.

Your book is subtitled The Roots of U.S. Fascism. There's a growing awareness among pundits and some political leaders, President Biden, for instance, of the rise of fascist or fascist-like politics in the United States: a politics of racist nationalism, trading in perceived grievances and centered on devotion to an autocratic leader. Your book argues that today's American fascism has roots as far back as the Anglo settlement of Mexican Texas. Why do you think so?

The genocidal and enslaving impulse has been essential to fascism whenever it has reared its ugly head globally. In Texas, as in the wider republic, this involved class collaboration between and among a diverse array of settlers for mutual advantage. This class collaboration persists to this very day and could be espied on 6 January 2021 and thereafter.

Read more:

1836, the Slaveholder Republic's Birthday - The Texas Observer

General officers in the Confederate States Army – Wikipedia

Senior military leaders of the Confederate States of America

The general officers of the Confederate States Army (CSA) were the senior military leaders of the Confederacy during the American Civil War of 1861-1865. They were often former officers from the United States Army (the regular army) prior to the Civil War, while others were given the rank based on merit or when necessity demanded. Most Confederate generals needed confirmation from the Confederate Congress, much like prospective generals in the modern U.S. armed forces.

Like all of the Confederacy's military forces, these generals answered to their civilian leadership, in particular Jefferson Davis, the South's president and therefore commander-in-chief of the Army, Navy, and the Marines of the Confederate States.

Much of the design of the Confederate States Army was based on the structure and customs of the U.S. Army[1] when the Confederate Congress established their War Department on February 21, 1861.[2] The Confederate Army was composed of three parts: the Army of the Confederate States of America (ACSA, intended to be the permanent, regular army), the Provisional Army of the Confederate States (PACS, or "volunteer" Army, to be disbanded after hostilities), and the various Southern state militias.

Graduates from West Point and Mexican War veterans were highly sought after by Jefferson Davis for military service, especially as general officers. Like their Federal counterparts, the Confederate Army had both professional and political generals within it. Ranks throughout the CSA were roughly based on the U.S. Army in design and seniority.[3] On February 27, 1861, a general staff for the army was authorized, consisting of four positions: an adjutant general, a quartermaster general, a commissary general, and a surgeon general. Initially the last of these was to be a staff officer only.[2] The post of adjutant general was filled by Samuel Cooper (the position he had held as a colonel in the U.S. Army from 1852 until resigning) and he held it throughout the Civil War, as well as the army's inspector general.[4]

Initially, the Confederate Army commissioned only brigadier generals in both the volunteer and regular services;[2] however, the Congress quickly passed legislation allowing for the appointment of major generals as well as full generals, providing clear and distinct seniority over the existing major generals in the various state militias.[5] This legislation was passed on May 16, 1861, when there were only five officers at the grade of brigadier general, and stated in part:

That the five general officers provided by existing laws for the Confederate States shall have the rank and denomination of 'general', instead of 'brigadier-general', which shall be the highest military grade known to the Confederate States ...[6]

As of September 18, 1862, when lieutenant generals were authorized, the Confederate Army had four grades of general officers; they were (in order of increasing rank) brigadier general, major general, lieutenant general, and general.[7] As officers were appointed to the various grades of general by Jefferson Davis (and were confirmed), he would create the promotion lists himself. The dates of rank, as well as seniority of officers appointed to the same grade on the same day, were determined by Davis "usually following the guidelines established for the prewar U.S. Army."[8]

These generals were most often infantry or cavalry brigade commanders, aides to other higher ranking generals, and War Department staff officers. By war's end the Confederacy had at least 383 different men who held this rank in the PACS, and three in the ACSA: Samuel Cooper, Robert E. Lee, and Joseph E. Johnston.[9] The organization of regiments into brigades was authorized by the Congress on March 6, 1861. Brigadier generals would command them, and these generals were to be nominated by Davis and confirmed by the Confederate Senate.[2]

Though close to the Union Army in assignments, Confederate brigadiers mainly commanded brigades while Federal brigadiers sometimes led divisions as well as brigades, particularly in the first years of the war. These generals also often led sub-districts within military departments, with command over soldiers in their sub-district. These generals outranked Confederate Army colonels, who commonly led infantry regiments.

This rank is equivalent to brigadier general in the modern U.S. Army.

These generals were most commonly infantry division commanders, aides to other higher ranking generals, and War Department staff officers. They also led the districts that made up military departments and had command over the troops in their districts. Some major generals also led smaller military departments. By war's end, the Confederacy had at least 88 different men who had held this rank, all in the PACS.[10]

Divisions were authorized by the Congress on March 6, 1861, and major generals would command them. These generals were to be nominated by Davis and confirmed by the Senate.[2] Major generals outranked brigadiers and all other lesser officers.

This rank was not synonymous with the Union's use of it, as Northern major generals led divisions, corps, and entire armies. This rank is equivalent in most respects to major general in the modern U.S. Army.


Evander McIvor Law was promoted to the rank of major general on March 20, 1865, on the recommendation of Generals Johnston and Hampton, just before the surrender. However, the promotion came too late to be confirmed by the Confederate Congress.

There were 18 lieutenant generals in the Confederate Army, and these general officers were often corps commanders within armies or military department heads, in charge of geographic sections and all soldiers in those boundaries. All of the Confederacy's lieutenant generals were in the PACS.[10] The Confederate Congress legalized the creation of army corps on September 18, 1862, and directed that lieutenant generals lead them. These generals were to be nominated by President Davis and confirmed by the C.S. Senate.[7] Lieutenant generals outranked major generals and all other lesser officers.

This rank was not synonymous with the Federal use of it; Ulysses S. Grant (1822-1885) was one of only two Federal lieutenant generals during the war. The other was Winfield Scott (1786-1866), General-in-Chief of the United States Army from 1841 to 1861, who had served in the War of 1812 (1812-1815), led an army in the field during the Mexican-American War (1846-1848), and received a promotion to brevet lieutenant general by a special Act of Congress in 1855. Grant was, by the time of his promotion on March 9, 1864, the only Federal lieutenant general in active service. He became General-in-Chief, commander of the United States Army and of all the Union armies, answering directly to President Abraham Lincoln and charged with the task of leading the Federal armies to victory over the southern Confederacy. The CSA lieutenant general rank is also roughly equivalent to lieutenant general in the modern U.S. Army.

The Congress passed legislation in May 1864 to allow for "temporary" general officers in the PACS, to be appointed by President Jefferson Davis and confirmed by the C.S. Senate and given a non-permanent command by Davis.[12] Under this law, Davis appointed several officers to fill open positions. Richard H. Anderson was appointed a "temporary" lieutenant general on May 31, 1864, and given command of the First Corps in the Army of Northern Virginia commanded by Gen. Lee (following the wounding of Lee's second-in-command, Lt. Gen. James Longstreet on May 6 in the Battle of the Wilderness.) With Longstreet's return that October, Anderson reverted to a major general. Jubal Early was appointed a "temporary" lieutenant general on May 31, 1864, and given command of the Second Corps (following the reassignment of Lt. Gen. Richard S. Ewell to other duties) and led the Corps as an army into the third Southern invasion of the North in July 1864 with battles at the Monocacy near Frederick, Maryland and Fort Stevens outside the Federal capital city of Washington, D.C., until December 1864, when he too reverted to a major general. Likewise, both Stephen D. Lee and Alexander P. Stewart were appointed to fill vacancies in the Western Theater as "temporary" lieutenant generals and also reverted to their prior grades as major generals as those assignments ended. However, Lee was nominated a second time for lieutenant general on March 11, 1865.[13]

Originally five officers in the South were appointed to the rank of general, and only two more would follow. These generals occupied the senior posts in the Confederate Army, mostly entire army or military department commanders, and advisers to Jefferson Davis. This rank is equivalent to general in the modern U.S. Army, and the grade is often referred to in modern writings as "full general" to help differentiate it from the generic term "general" meaning simply "general officer".[15]

All Confederate generals were enrolled in the ACSA to ensure that they outranked all militia officers,[5] except for Edmund Kirby Smith, who was appointed general late in the war and into the PACS. Pierre G.T. Beauregard, who had also initially been appointed a PACS general, was elevated to ACSA two months later with the same date of rank.[16] These generals outranked all other grades of generals, as well as all lesser officers in the Confederate States Army.

The first group of officers appointed to general was Samuel Cooper, Albert Sidney Johnston, Robert E. Lee, Joseph E. Johnston, and Pierre G.T. Beauregard, with their seniority in that order. This ordering made Cooper, a staff officer who would not see combat, the senior general officer in the CSA. That seniority strained the relationship between Joseph E. Johnston and Jefferson Davis: Johnston considered himself the senior officer in the Confederate States Army and resented the ranks that President Davis had authorized. However, his previous position in the U.S. Army was a staff, not line, assignment, which was evidently a criterion Davis used in establishing seniority and rank in the subsequent Confederate States Army.[17]

On February 17, 1864, legislation was passed by Congress to allow President Davis to appoint an officer to command the Trans-Mississippi Department in the Far West, with the rank of general in the PACS. Edmund Kirby Smith was the only officer appointed to this position.[18] Braxton Bragg was appointed a general in the ACSA with a date of rank of April 6, 1862, the day his commanding officer Gen. Albert Sidney Johnston died in combat at Shiloh/Pittsburg Landing.[19]

The Congress passed legislation in May 1864 to allow for "temporary" general officers in the PACS, to be appointed by Davis and confirmed by the C.S. Senate and given a non-permanent command by Davis.[12]John Bell Hood was appointed a "temporary" general on July 18, 1864, the date he took command of the Army of Tennessee in the Atlanta Campaign, but this appointment was not later confirmed by the Congress, and he reverted to his rank of lieutenant general in January 1865.[20] Later in March 1865, shortly before the end of the war, Hood's status was spelled out by the Confederate States Senate, which stated:

Resolved, That General J. B. Hood, having been appointed General, with temporary rank and command, and having been relieved from duty as Commander of the Army of Tennessee, and not having been reappointed to any other command appropriate to the rank of General, he has lost the rank of General, and therefore cannot be confirmed as such.[21]

Note that during 1863, Beauregard, Cooper, J. Johnston, and Lee all had their ranks re-nominated on February 20 and then re-confirmed on April 23 by the Confederate Congress.[13] This was in response to debates on February 17 about whether confirmations made by the provisional legislature needed re-confirmation by the permanent legislature, which was done by an Act of Congress issued two days later.[22]

The position of General in Chief of the Armies of the Confederate States was created on January 23, 1865. The only officer appointed to it was Gen. Robert E. Lee, who served from February 6 until April 12.

The Southern states had had militias in place since Revolutionary War times, consistent with the U.S. Militia Act of 1792. They went by varied names, such as State "Militia," "Armies," or "Guard," and were activated and expanded when the Civil War began. These units were commanded by militia generals to defend their particular state and sometimes did not leave native soil to fight for the Confederacy. The Confederate militias used the general officer ranks of brigadier general and major general.

The regulations in the Act of 1792 provided for two classes of militia, divided by age. Class one was to include men from 22 to 30 years old, and class two would include men from 18 to 20 years as well as from 31 to 45 years old.[23] The various southern states were each using this system when the war began.

All Confederate generals wore the same uniform insignia regardless of which grade of general they were,[24] except for Robert E. Lee, who wore the uniform of a Confederate colonel. The only visible difference was the button groupings on their uniforms: groups of three buttons for lieutenant and major generals, and groups of two for brigadier generals. In either case, a general's buttons were also distinguished from other ranks by their eagle insignia.

One surviving example of the CSA general's full uniform is that of Brig. Gen. Joseph R. Anderson of the Confederacy's Ordnance Department. All of the South's generals wore uniforms like this regardless of which grade of general they were, and all with gold-colored embroidery.

The general officers of the Confederate Army were paid for their services; exactly how much (in Confederate dollars, CSD) depended on their rank and whether they held a field command. On March 6, 1861, when the army contained only brigadier generals, their pay was $301 CSD monthly, and their aide-de-camp lieutenants received an additional $35 CSD per month beyond regular pay. As more grades of general officer were added, the pay scale was adjusted. By June 10, 1864, a general received $500 CSD monthly, plus another $500 CSD if he led an army in the field. Also by that date, lieutenant generals got $450 CSD and major generals $350 CSD, and brigadiers received an additional $50 CSD beyond regular pay if they served in combat.[25]

The CSA lost more general officers killed in combat than the Union Army did throughout the war: about one in five Southern generals died in combat, compared to roughly one in twelve in the North.[26] The most famous of them is General Thomas "Stonewall" Jackson, probably the best-known Confederate commander after General Robert E. Lee.[27] Jackson died of pneumonia that set in after he was wounded by friendly fire at Chancellorsville on the night of May 2, 1863. Replacing these fallen generals was an ongoing problem during the war, often resulting in men being promoted beyond their abilities (a common criticism of officers such as John Bell Hood[28] and George E. Pickett,[29] but an issue for both armies) or kept in command despite grave combat wounds because they were needed, such as Richard S. Ewell.[30] The problem was made more difficult by the South's depleting manpower, especially near the war's end.

The last Confederate general in the field, Stand Watie, surrendered on June 23, 1865, and the war's last surviving full general, Edmund Kirby Smith, died on March 28, 1893.[31] James Longstreet died on January 2, 1904, and was considered "the last of the high command of the Confederacy".[32]

The Confederate Army's system of using four grades of general officers is currently the same rank structure used by the U.S. Army (in use since shortly after the Civil War) and is also the system used by the U.S. Marine Corps (in use since World War II).

View original post here:

General officers in the Confederate States Army - Wikipedia

President Biden Announces Key Appointments to Boards and Commissions – The White House

WASHINGTON Today, President Biden announced his intent to appoint the following individuals to serve in key roles:

Council of the Administrative Conference of the United States

Administrative Conference of the United States (ACUS) is an independent federal agency charged with convening expert representatives from the public and private sectors to recommend improvements to administrative process and procedure. ACUS initiatives promote efficiency, participation, and fairness in the promulgation of federal regulations and in the administration of federal programs. The ten-member ACUS Council is composed of government officials and private citizens.

Kristen Clarke, Member, Council of the Administrative Conference of the United States

Kristen Clarke is the Assistant Attorney General for Civil Rights at the U.S. Department of Justice. In this role, she leads the Justice Department's broad federal civil rights enforcement efforts and works to uphold the civil and constitutional rights of all who live in America. Clarke is a lifelong civil rights lawyer who has spent her entire career in public service. She most recently served as President and Executive Director of the Lawyers' Committee for Civil Rights Under Law, one of the nation's leading civil rights organizations, founded at the request of John F. Kennedy.

Fernando Raul Laguarda, Member, Council of the Administrative Conference of the United States

Fernando Laguarda is General Counsel at AmeriCorps. Prior to his current role, he was Faculty Director of the Program on Law and Government and a Professor at American University Washington College of Law, where he taught and developed courses in administrative law, legislation, and antitrust, and launched the law school's LL.M. in Legislation. Laguarda also founded the nation's first student-centered initiative to study the work of government oversight entities and was faculty advisor to the Latino Law Students Association. Fernando has worked in the telecommunications industry and as a partner at two different Washington, D.C. law firms focusing on technology and competition law. He was a Founder, served as General Counsel, and eventually became Board Chair of the National Network to End Domestic Violence. Laguarda has also served as a member of numerous non-profit, civil rights, academic, and advisory boards. Laguarda received his J.D. cum laude from Georgetown University Law Center and his A.B. cum laude in government from Harvard College.

Anne Joseph O'Connell, Member, Council of the Administrative Conference of the United States

Anne Joseph O'Connell, a lawyer and social scientist, is the Adelbert H. Sweet Professor of Law at Stanford University. Her research and teaching focus on administrative law and public administration. She is a three-time recipient of the American Bar Association's Scholarship Award in Administrative Law for the best article or book published in the preceding year, and a two-time winner of the Richard D. Cudahy Writing Competition on Regulatory and Administrative Law from the American Constitution Society. O'Connell joined Gellhorn and Byse's Administrative Law: Cases and Comments casebook as a co-editor with the twelfth edition. Most recently, her work has focused on acting officials and delegations of authority in federal agencies. Her research has been cited by Congress, the Supreme Court, lower federal courts, and the national media. She is an elected fellow of the American Academy of Arts and Sciences and the National Academy of Public Administration.

Before entering law school teaching, O'Connell clerked for Justice Ruth Bader Ginsburg and Judge Stephen F. Williams and served as a trial attorney for the Federal Programs Branch of the Department of Justice's Civil Division. A Truman Scholar, she worked for a number of federal agencies in earlier years. O'Connell received a B.A. in Mathematics from Williams College, an M.Phil. in the History and Philosophy of Science from Cambridge University, a J.D. from Yale Law School, and a Ph.D. in Political Economy and Government from Harvard University.

Jonathan Su, Member, Council of the Administrative Conference of the United States

Jonathan Su most recently served as Deputy Counsel to the President. Prior to his service at the White House, Su was the Deputy Office Managing Partner of the Washington, D.C. office of Latham & Watkins LLP, where he was also a partner in the White Collar Defense & Investigations practice. During the Obama-Biden Administration, Su served as Special Counsel to the President. Su was also a federal prosecutor at the United States Attorney's Office for the District of Maryland. He served as a law clerk for U.S. Circuit Judge Ronald M. Gould and U.S. District Judge Julian Abele Cook, Jr. Su is a graduate of the University of California at Berkeley and Georgetown University Law Center.

National Capital Planning Commission

Established by Congress in 1924, the National Capital Planning Commission (NCPC) is the federal government's central planning agency for the National Capital Region. Through planning, policymaking, and project review, NCPC protects and advances the federal government's interest in the region's development. The Commission provides overall planning guidance for federal land and buildings in the region by reviewing the design of federal and certain local projects, overseeing long-range planning for future development, and monitoring capital investment by federal agencies. The 12-member Commission represents federal and local constituencies with a stake in planning for the nation's capital.

Bryan Clark Green, Commissioner, National Capital Planning Commission

Bryan Green leverages his expertise as an educator, writer, and practicing preservationist to embrace the role of architecture in America's larger story. He began his career at the Virginia Historical Society, worked for the Virginia Department of Historic Resources, and was a Senior Associate and Director of Historic Preservation at Commonwealth Architects. He later joined the Tidewater and Big Bend Foundation as Executive Director. Green is the author of the forthcoming work In Jefferson's Shadow: The Architecture of Thomas R. Blackburn; co-author of Lost Virginia: Vanished Architecture of the Old Dominion; and co-author, with Kathleen James-Chakraborty and Katherine Kuenzli, of After the Monuments Fall: The Removal of Confederate Monuments from the American South (LSU Press). Green graduated from the University of Notre Dame with a Bachelor's in History and obtained his Master's and Ph.D. in Architectural History at the University of Virginia.

He serves as Chair, Preservation Officer, and ex officio member of the Board of the Heritage Conservation Committee of the Society of Architectural Historians. He co-chairs the Publications Committee of the Association for Preservation Technology International and serves on the Commonwealth of Virginia's Citizens Advisory Council on Furnishing and Interpreting the Executive Mansion, and formerly served on the City of Richmond Commission of Architectural Review and Urban Design committees. Green's longstanding commitment to this work led to his Honorary Membership in both the Virginia Society and the Richmond Chapter of the American Institute of Architects.

Elizabeth M. Hewlett, Commissioner, National Capital Planning Commission

Elizabeth M. Hewlett is an attorney and servant of the public interest. She recently retired from her second tenure as the Chairman of the Prince George's County Planning Board and the Maryland-National Capital Park and Planning Commission. She has represented Maryland on the Washington Metropolitan Area Transit Authority and served as a Principal at Shipley, Horne and Hewlett, P.A., a law firm where she represented individuals, businesses, and real estate clients while also rendering many community-centric pro bono services. Hewlett has participated in or led dozens of public boards, civic groups, and key initiatives, including the Prince George's County Census effort, the Maryland State Board of Law Examiners, and the Governor's Drug and Alcohol Abuse Commission.

Throughout her career, Hewlett has also been a contributor to several legal and professional organizations, including the National Bar Association, the Women's Bar Association of Maryland, the J. Franklyn Bourne Bar Association, the National Association for the Advancement of Colored People, and Delta Sigma Theta Sorority, Inc. She has received many awards, including the Wayne K. Curry Distinguished Service Award, the National Bar Association Presidential Lifetime Achievement Award, and the J. Joseph Curran Award for Public Service. She is a graduate of Tufts University, Boston College Law School, and the John F. Kennedy School of Government Executive Program at Harvard University.

President's Intelligence Advisory Board

The President's Intelligence Advisory Board is an independent element within the Executive Office of the President. The Board exists exclusively to assist the President by providing an independent source of advice on the effectiveness with which the Intelligence Community is meeting the nation's intelligence needs and the vigor and insight with which the community plans for the future. The President is able to appoint up to 16 members of the Board.

Anne M. Finucane, Member, President's Intelligence Advisory Board

Anne Finucane currently serves as Chairman of the Board for Bank of America Europe. She also serves on the board of Bank of America Securities Europe SA, the bank's EU broker-dealer in Paris. Finucane served as the first woman Vice Chairman of Bank of America. She led the company's strategic positioning and global sustainable and climate finance work; environmental, social and governance (ESG) efforts; capital deployment; and public policy efforts. She is widely recognized for pioneering sustainable finance in the banking industry. For most of her career, Finucane also oversaw marketing, communications, and data and analytics at the company, and is credited with leading Bank of America's successful efforts to reposition the company and repair its reputation after the 2008 financial crisis.

Finucane serves on a variety of corporate and nonprofit boards of directors, including CVS Health, Williams Sonoma, Mass General Brigham Healthcare, Special Olympics, the (RED) Advisory Board, and the Carnegie Endowment for International Peace. She previously served on the U.S. State Department's Foreign Affairs Policy Board and is a member of the Council on Foreign Relations. Finucane has consistently been highlighted in "most powerful women" lists, including in American Banker, Fortune, and Forbes. In 2021, she received the Carnegie Hall Medal of Honor; in 2019, she was inducted into the American Advertising Federation's Advertising Hall of Fame and received the Edward M. Kennedy Institute Award for Inspired Leadership.

###

Continued here:

President Biden Announces Key Appointments to Boards and Commissions - The White House

8 Best Canadian Whiskies of 2022 – HICONSUMPTION

In a world of whiskies where identity is key, Canadian whisky might just suffer for its ability to do everything. From making a Scotch-style single malt to an American-style bourbon, distilleries from our northern neighbors thrive on that very versatility, opening the doors to creativity and innovation. Luckily, in recent years, Canadian liquor has been on the rise Stateside. It's yet to build up the exotic cachet of Scotch or Japanese whisky, but we're confident that it's only a matter of time. To help you get started, we've compiled a guide to the best Canadian whiskies to drink right now.

And Rye Is It So Good?

Although it's often called rye whisky, Canadian rye whisky is much different from American rye whiskey (other than the added e), which can contain as much as 100% rye in the mashbill. For one, the rye in Canadian whisky refers to the grain being added to a predominantly corn mashbill. Whereas most popular whisky-making regions (think Scotland, Ireland, Japan, and the United States) specialize in a certain style or styles, brought on by the prominence of a specific grain or still type, Canada is known for its eclectic variety, and its whisky is frequently blended from different styles.

That said, there are some legal stipulations pinned to making Canadian whisky thanks to the nation's Food and Drugs Act. Most importantly, the liquor is required to be mashed, distilled, and aged in Canada. Additionally, it must be aged in small wood vessels for at least three years and bottled at no less than 40% ABV. Unlike in many other regions, caramel may be added for flavoring, as long as the whisky doesn't lose the aroma and taste generally attributed to Canadian whisky.

Slow But Steady

Around since the 1700s, Canadian whisky mostly began as a wheat spirit, since that's what primarily grew in the country at the time. Rye was added for flavor, thus creating what would become the profile and identity of the spirit for some time. The liquor really started to boom in the 19th century in England, which was having trouble sourcing its whisky elsewhere. Later on, during the Civil War in the United States, the North looked to Canada for its liquor, since it refused to buy products from the Confederate states, which happened to be the source of most of the whiskey in the country.

Canada was the first nation to enact an aging requirement, only one year in 1887 before eventually increasing to three, and Canadian whisky was able to capitalize on the repeal of Prohibition, since many U.S. distilleries had shut down and consumers wanted something besides the bootleg whiskey they had been drinking for 13 years. Likewise, a lot of Canadian product had been aging in barrels, waiting for demand to return. As with most spirits (other than vodka), Canadian whisky lost ground to wine and beer, the favored alcoholic drinks throughout the '70s and '80s, until 1992, when Forty Creek reclaimed what Canadian whisky could be.

Launched in 1946, Alberta Distillers started making rye whisky a couple of decades after it went out of style and long before it had cachet again. A few years ago, the number-one rye producer in North America decided to do something a little different. Where its contemporaries were finishing their whiskies in former wine casks, Alberta was putting it straight in the batch, blending 91% rye, 8% bourbon, and 1% sherry to make its Dark Batch, which rides on a profile of vanilla, oak, dried stone fruit, citrus, and baking spices.

Lot 40 was created by Corby Spirit and Wine in 1998 as a limited-edition homage to pre-Prohibition-style rye whisky. After the resurgence of rye, it was launched as its own brand in 2012 and has since become one of the most decorated Canadian whiskies. Utilizing a mashbill of 100% unmalted rye, Lot 40, which gets its namesake from the plot of land owned by one of its founders, is distilled in copper pot stills one batch at a time and aged in new American oak barrels, much like bourbon. The result is a dry and complex profile of spice, dark fruit, and citrus.

Since 2011, British Columbia-based distillery Shelter Point has made all of its whiskies with barley grown on its own 380-acre property and water from a river that runs through its estate. Its highly popular small-batch Smoke Point expression takes after the peated single malts of Scotland. Made in pot stills, Batch #3 has already won a plethora of awards this year, including Double Gold at the San Francisco World Spirits Competition and Best Single Malt at the Canadian Whisky Awards.

With 165 years of whisky-making experience, J.P. Wiser's is one of the oldest operating distilleries in the nation. Featuring a low-rye mashbill, this 18-year-old corn whisky, the brand's highest age statement, is double column distilled, blended, and aged for nearly two decades in Canadian oak casks. Perfect for sipping neat, this expression goes down super smooth with a dynamic palate of pine, oak, apple, and floral notes, and a long finish. And with a sub-$60 price point, this is one of the best deals you'll find in any liquor category.

What Jack Daniel's is to American whiskey, Crown Royal is to our neighbors to the north. Easily Canada's most recognizable brand, the Gimli giant has been heading in a new premium direction as of late. However, that purple bag and those picturesque bottles have always come underpinned with an air of elegance. The most recent version of the Noble Collection's Winter Wheat blended whisky has been the brand's hottest batch as of late, even winning Best Whisky Overall at the Canadian Whisky Awards back in February.

Since its launch in 1992, Forty Creek has been paving the way for Canadian whisky with its resilient approach to thinking outside the box. Credited with helping revive the national spirit, Forty Creek's small-batch Confederation Oak Reserve, named after the Canadian Confederation of 1867, blends together three spirits of different ages, made from a mashbill of corn, rye, and barley, which are then finished for two years in Canadian oak casks. The colder weather imparts a profile of vanilla, buttercream, pepper, and walnut.

Billed as Canada's first single-barrel whisky, this marvelous expression from Caribou Crossing comes from one of around 200,000 casks in the distillery's collection. Bourbon lovers might compare its caribou bottle topper to Blanton's galloping horse, but the flavor profile can stand toe-to-toe as well. Easily one of the most prolific top-shelf options from the Great White North, Caribou Crossing's Single Barrel soars with a slightly fluctuating medium-body profile of vanilla, honey, pepper, and fruit.

Rye typically matures much faster than corn- or barley-based whiskies. Nevertheless, the folks at Lock Stock & Barrel have found magic in their process, utilizing a mashbill of 100% rye. The brand's top-shelf 21 Year was double distilled in copper pot stills before being aged for over two decades in new charred American oak barrels. Bottled at 111 proof, this whisky has a definite heat undergirding notes of cinnamon, caramel, cocoa, anise, and treacle, giving way to a long finish of leather, oak, and spice.

Original post:

8 Best Canadian Whiskies of 2022 - HICONSUMPTION

Inside Lake Lanier’s Deaths And Why People Say It’s Haunted – All That’s Interesting

Constructed right atop the historically Black town of Oscarville, Georgia, in 1956, Lake Lanier has become one of the most dangerous bodies of water in America, with the remains of buildings just below the surface ensnaring hundreds of boats and swimmers.

Each year, more than 10 million people visit Lake Lanier in Gainesville, Georgia. However unassuming the massive, placid lake might look, it's considered one of the deadliest in America; indeed, there have been 700 deaths at Lake Lanier since its construction in 1956.

This shocking number of accidents at the lake has led many to theorize that the site may, in fact, be haunted.

And given the controversial circumstances surrounding the lake's construction, and the history of racial violence in the former town of Oscarville, whose ruins lie beneath the lake's surface, there might be some truth to this idea.

In 1956, the United States Army Corps of Engineers was tasked with creating a lake to provide water and power to parts of Georgia and help to prevent the Chattahoochee River from flooding.

They chose to construct the lake near Oscarville, in Forsyth County. Named after the poet and Confederate soldier Sidney Lanier, Lake Lanier has 692 miles of shoreline, making it the largest lake in Georgia and far, far larger than the town of Oscarville, which the Corps of Engineers forcibly emptied so that the lake could be built.

In total, 250 families were displaced, roughly 50,000 acres of farmland were destroyed, and 20 cemeteries were either relocated or otherwise engulfed by the lakes waters over its five-year construction period.

The town of Oscarville, however, was strangely not demolished before the lake was filled, and its ruins still rest at the bottom of Lake Lanier.

Divers have reported finding fully intact streets, walls, and houses, making it the single most dangerous underwater surface in the United States.

The flooded structures, coupled with declining water levels, are presumed to be a major factor in the high number of deaths that occur yearly at Lake Lanier, catching swimmers and holding them under or damaging boats with debris.

The deaths at Lake Lanier aren't the typical sort, though. While there are many instances of people drowning, there are also reports of boats randomly going up in flames, freak accidents, missing persons, and inexplicable tragedies.

Some believe the region's dark past is responsible for these incidents. Legend asserts that the vengeful and restless spirits of those whose graves were flooded, many of whom were Black residents persecuted and driven out by violent white mobs, are behind this curse.

The town of Oscarville was once a bustling, turn-of-the-century community and a beacon for Black culture in the South. At the time, 1,100 Black people owned land and operated businesses in Forsyth County alone.

But on Sept. 9, 1912, an 18-year-old white woman named Mae Crow was raped and murdered near Brown's Bridge, on the banks of the Chattahoochee River, right by Oscarville.

According to the Oxford American, Mae Crow's murder was pinned on four young Black people who happened to live in the area nearby: siblings Oscar and Trussie Jane Daniel, only 18 and 22 respectively, and their 16-year-old cousin Ernest Knox. With them was Robert "Big Rob" Edwards, 24.

Edwards was arrested for Crow's rape and murder and taken to jail in Cumming, Georgia, the seat of Forsyth County.

A day later, a white mob invaded Edwards' jail cell. They shot him, dragged him through the streets, and hanged him from a telephone pole outside the courthouse.

A month later, Ernest Knox and Oscar Daniel appeared in court for the rape and murder of Mae Crow. They were found guilty by the jury in just over an hour.

Some 5,000 people gathered to watch the teenagers be hanged.

Trussie Daniel's charges were dismissed, but it's widely believed that all three boys were innocent of the crimes.

Following Edwards' lynching, white mobs known as "night riders" started going door to door across Forsyth County with torches and guns, burning down Black businesses and churches and demanding that all Black citizens vacate the county.

As Narcity reported, to this day less than five percent of Forsyth County's population is Black.

But perhaps Lake Lanier is haunted by some other force?

The most popular legend surrounding Lake Lanier is called The Lady of the Lake.

As the story goes, in 1958, two young girls named Delia May Parker Young and Susie Roberts were at a dance in town but had decided to leave early. On the way home, they stopped to get gas and then left without paying for it.

They were driving across a bridge over Lake Lanier when they lost control of the car, spiraling off the edge and crashing into the dark waters below.

A year later, a fisherman out on the lake came across a decomposed, unrecognizable body floating near the bridge. At the time, no one could identify who it belonged to.

It wasn't until 1990, when officials discovered a 1950s Ford sedan at the bottom of the lake with the remains of Susie Roberts inside, that they were able to identify the body found three decades earlier as Delia May Parker Young's.

But locals already knew who she was. They had reportedly seen her, still in her blue dress, wandering near the bridge at night with handless arms, waiting to drag unsuspecting lake-goers to the bottom.

Other people have reported seeing a shadowy figure sitting on a raft, inching himself across the water with a long pole and holding up a lantern to see.

Besides these ghost stories of yore, there are those who claim that the lake is haunted by the spirits of the 27 victims who have died in Lake Lanier over the years, but whose bodies were never found.

In the end, though, ghost stories are perhaps nothing more than a fun way to write off an otherwise tragic history littered with racist violence as well as unsafe and poorly planned construction.

Regardless of the lake's size, for 700 people to have died there in less than 70 years, something must be wrong. The Army Corps of Engineers initially believed that the submerged town of Oscarville wouldn't cause any harm, but the lake also wasn't constructed to be recreational; it was meant to supply water from the Chattahoochee River to towns and cities in Georgia.

Many of the deaths can likely be attributed to things as simple as not wearing a life jacket, drinking alcohol while out on the lake, accidents, or incorrectly assuming that shallow water is always safe.

Perhaps the only thing that truly haunts Lake Lanier is its bigoted past.

After reading about the deaths at Lake Lanier and the lake's history, learn about Ohio's Franklin Castle, which quickly became a house of horrors. Then, see the twisted, dark history of the Myrtles Plantation in Louisiana.

Read this article:

Inside Lake Lanier's Deaths And Why People Say It's Haunted - All That's Interesting

Gene therapy can make a real impact on global health but we need equitable access, say experts – World Economic Forum

Low- and middle-income countries (LMICs) can and should play a leading role in dictating the future of the world's most advanced healthcare technologies, according to the World Economic Forum's Accelerating Global Access to Gene Therapies: Case Studies from Low- and Middle-Income Countries white paper.

Gene therapy is at the forefront of modern medicine. By making precise changes to the human genome, these sophisticated technologies can potentially lead to one-time lifelong cures for infectious and non-communicable diseases (e.g. HIV, sickle cell disease) that affect tens of millions of people around the globe, most of whom live in LMICs. However, too often the benefits of advanced healthcare technologies remain restricted to high-income countries (HICs), a fate that could befall gene therapies as well.

The narrative that new healthcare technologies are unsuitable for LMICs is a long-standing rationale for excluding a majority of the world from the benefits of modern medicine. Without concerted efforts to build gene therapy capacity in LMICs, the global health divide will continue to widen.

The gene therapy industry is in its infancy, but early clinical successes and substantial funding have generated enormous momentum. This is an ideal moment for LMICs to enter the global market, prioritizing the needs of communities carrying the highest disease burdens.

We asked five clinical researchers from LMICs, who are all co-authors on the recent white paper, what innovations on the ground and changes at policy-level need to happen for gene therapy to make a real impact on global health.

Dr. Cissy Kityo Mutuluza, Executive Director, Joint Clinical Research Centre, Uganda

Although gene therapy has the potential to treat or even cure life-limiting diseases and infections, the full impact will only be realized if we deliver it for the benefit of all people, instead of fueling more health inequity between and within countries.

An essential first step towards maximizing the global impact of gene therapies is to build research and development (R&D) capacity in LMICs. Current gene therapy R&D has mainly excluded LMICs, instead centering pre-clinical and clinical work in HICs. Gene therapy R&D needs to be performed in regions where target diseases are prevalent to ensure that these therapies are safe and effective for those populations. Manufacturing technologies and healthcare infrastructure, which are the cost drivers for gene therapy products in HICs, need to be replaced with innovative and simplified platforms and workflows that bring down costs and are functional and cost-effective within LMIC health systems.

As for policy and regulation, individual countries must establish gene therapy frameworks that enable R&D. The construction of such frameworks should be guided by recommendations from the World Health Organization, emphasizing safety, effectiveness and ethics.

A critical component in effective global health interventions is community outreach. Treatment acceptability is essential for future clinical trials, thus it is important for scientists and clinicians to be clear about the risks and benefits of gene therapies. Communication and education activities should be made accessible to a broad range of stakeholders. Gene therapy and gene editing technologies are complex and it can be difficult for the public to understand their possible benefits or side effects. However, patient and public support is critical for the successful adoption of any new technology.

Professor Johnny Mahlangu, University of the Witwatersrand, South Africa

The ongoing COVID-19 pandemic is accelerating innovation, implementation and acceptance of molecular therapeutics (e.g. mRNA vaccines) globally. As a result, there is escalating interest in developing molecular interventions for many other conditions, such as gene therapies for genetic diseases. Strategically leveraging infrastructure that is being developed for molecular therapeutics will be critical in manufacturing, testing, and delivering gene therapies across diverse settings. Three critical areas of consideration include:

The application of precision medicine to save and improve lives relies on good-quality, easily-accessible data on everything from our DNA to lifestyle and environmental factors. The opposite of a one-size-fits-all healthcare system, it has vast, untapped potential to transform the treatment and prediction of rare diseases, and disease in general.

But there is no global governance framework for such data and no common data portal. This is a problem that contributes to the premature deaths of hundreds of millions of rare-disease patients worldwide.

The World Economic Forum's Breaking Barriers to Health Data Governance initiative is focused on creating, testing and growing a framework to support effective and responsible access across borders to sensitive health data for the treatment and diagnosis of rare diseases.

The data will be shared via a federated data system: a decentralized approach that allows different institutions to access each others data without that data ever leaving the organization it originated from. This is done via an application programming interface and strikes a balance between simply pooling data (posing security concerns) and limiting access completely.

The project is a collaboration between entities in the UK (Genomics England), Australia (Australian Genomics Health Alliance), Canada (Genomics4RD), and the US (Intermountain Healthcare).

Professor Vikram Mathews, Christian Medical College, Vellore, India

Gene therapy is on course to revolutionize medical care for several conditions. The hope is that gene therapy will be a one-time curative therapeutic intervention for diseases ranging from inherited hemoglobinopathies, such as sickle cell disease and thalassemia, to acquired diseases such as HIV.

A primary challenge limiting access to these life-saving therapies is their astronomical costs, making them inaccessible even in the developed countries where most gene therapies have originated. Due to economic challenges, there is often a mismatch between the regions of the world where development and clinical research happen and the regions where the incidence of the target disease is highest. Classic examples are sickle cell disease and HIV, both of which have their highest incidence rates in Africa.

Moving the manufacturing of gene therapy products to local regions and point-of-care settings (within hospitals) is a strategy that can both significantly reduce the cost of these products and improve accessibility. Additionally, current gene therapy approaches use expensive ex vivo procedures that require removal of a patient's cells from their body. Instead, researchers must develop novel in vivo methods that simplify the procedure to a single injection directly into the patient, saving time and money.

Professor Julie Makani, Muhimbili University of Health and Allied Sciences, Tanzania

In order for gene therapy to have an impact on global health, changes in innovation and policy must occur at several levels: individual, institutional, national, continental and global.

At the individual level, patients and personnel are the primary focal points. Taking a patient-centered approach will ensure that the community is involved in research and will have a say in receiving a particular health intervention when it is available. For personnel working in areas pertinent to gene therapy, including healthcare, research and education, there is a need to increase knowledge and to change perspectives regarding the advancements and achievements made within the field of gene therapy.

At the national, continental and global levels, genomic research is catalyzed by strategic partnerships and often occurs in Centers of Excellence (CoE). Many countries in Africa have established CoEs in academic settings, which integrate health and science programmes. These innovative environments help maximize resources (physical and human) and provide settings in which research and the translation of research findings into health interventions can be done contemporaneously, in the appropriate population and geographical region.

At the policy level, investments in global health and research in gene therapy must change. This can be done in three ways: direct investment in institutions in Africa; an increase in the level of investment through funding partnerships; and recognition that the duration of investment needs to be longer than the normal funding cycles of three to five years.

Professor Suradej Hongeng, Mahidol University, Thailand

Gene therapy has received global attention over the last few years, recognition that continues to grow with each new clinical success. The field is constantly evolving, with disruptive innovation across public and private sectors. However, access to these life-saving treatments remains restricted due to a number of technical and policy challenges.

First, researchers must continue to develop cost-effective ways to administer gene therapies to patients, an area of R&D where the private sector can play an important role. Yet many low- and middle-income countries (LMICs) have weak ecosystems to support the emergence of new companies or entice collaborations with multinational companies. Stronger private sector involvement will be critical for penetration into emerging markets.

Second, the unique nature of these personalized treatments makes them difficult to regulate within traditional frameworks, meaning that agencies must update current policies and regulations. As regulation evolves, it must also converge with the frameworks of other countries. This will make it easier for companies to navigate regulations and interact with agencies when performing clinical trials or bringing a therapy to multiple markets.

See the article here:
Gene therapy can make a real impact on global health but we need equitable access, say experts - World Economic Forum

Editas Rumored to be in Advanced Discussions around Potential Sale of Oncology Assets – BioSpace

From left: Editas CMO Baisong Mei and CEO Gilmore O'Neill/courtesy of Editas Medicine

CRISPR gene editing leader Editas Medicine often makes biotech headlines for its therapies for sickle cell and retinal diseases. Less often does it make the news for its preclinical cancer pipeline, which could be why the company is reportedly considering a sale of those assets.

Editas is in "advanced discussions" regarding the sale of its preclinical oncology lineup, according to reporting from Endpoints News. When asked to confirm the rumors, Cristi Barnett, VP and head of corporate communications at Editas, told BioSpace, "We have long shared our plans to pursue development and commercialization opportunities through partnerships, specifically with oncology and our iNK program."

Barnett added that with a new leadership team onboard, Editas undertook a strategic review to inform opportunities.

Investors seemed to agree with the notion as Editas stock rose 4.2% following the report.

Editas has given its C-Suite a makeover this year. In April, the company appointed genetic medicine veteran Gilmore O'Neill as president and CEO.

O'Neill wasted no time in bringing on board Sanofi veteran Baisong Mei to serve as the company's new chief medical officer. Mei has deep experience in the hemophilia space at both Sanofi and Bayer. He replaced Lisa Michaels, who was terminated by the company in February.

Editas presented data on one of its oncology assets, EDIT-202, last week at the European Society of Gene and Cell Therapy 29th Annual Meeting in Edinburgh, Scotland. EDIT-202 is a gene-edited iPSC-derived NK cell therapy that maintains prolonged persistence, high cytotoxicity and enhanced in vivo control of solid tumors, according to Editas.

"Currently, there is no change to our program or plans. EDIT-202 is advancing toward IND-enabling studies," Barnett said. She added that Editas will share additional updates on this program later this year, including additional preclinical data at an upcoming medical meeting.

Also at ESGCT, Editas presented preclinical data from another program, EDIT-103, which is being developed to treat rhodopsin-associated autosomal dominant retinitis pigmentosa (RHO-adRP), a progressive type of retinal degeneration.

In a non-human primate model, the therapy demonstrated nearly 100% knockout of the endogenous RHO gene. Additionally, the replacement RHO gene produced over 30% of normal RHO protein levels in the treated area of subretinal injection, the company reported.

Original post:
Editas Rumored to be in Advanced Discussions around Potential Sale of Oncology Assets - BioSpace

Editas Medicine Presents Preclinical Data on EDIT-103 for Rhodopsin-associated Autosomal Dominant Retinitis Pigmentosa at the European Society of Gene…

Studies in non-human primates demonstrated nearly 100% gene editing and knockout of endogenous RHO gene and more than 30% replacement protein levels using a dual vector AAV approach

Treated eyes showed morphological and functional photoreceptor preservation

EDIT-103 advancing towards IND-enabling studies

CAMBRIDGE, Mass., Oct. 13, 2022 (GLOBE NEWSWIRE) -- Editas Medicine, Inc. (Nasdaq: EDIT), a leading genome editing company, today announced ex vivo and in vivo preclinical data supporting its experimental medicine EDIT-103 for the treatment of rhodopsin-associated autosomal dominant retinitis pigmentosa (RHO-adRP). The Company reported these data in an oral presentation today at the European Society of Gene and Cell Therapy 29th Annual Meeting in Edinburgh, Scotland, UK.

EDIT-103 is a mutation-independent, CRISPR/Cas9-based, dual-AAV5-vector knockout-and-replace (KO&R) therapy to treat RHO-adRP. This approach has the potential to treat any of the more than 150 dominant gain-of-function rhodopsin mutations that cause RHO-adRP with a one-time subretinal administration.

"These promising preclinical data demonstrate the potential of EDIT-103 to efficiently remove the defective RHO gene responsible for RHO-adRP while replacing it with an RHO gene capable of producing sufficient levels of RHO to preserve photoreceptor structure and functions. The program is progressing towards the clinic," said Mark S. Shearman, Ph.D., Executive Vice President and Chief Scientific Officer, Editas Medicine. "EDIT-103 uses a dual AAV gene editing approach, and also provides initial proof of concept for the treatment of other autosomal dominant disease indications where a gain of negative function needs to be corrected."

Key findings include:

Full details of the Editas Medicine presentations can be accessed in the Posters & Presentations section on the Company's website.

About EDIT-103
EDIT-103 is a CRISPR/Cas9-based experimental medicine in preclinical development for the treatment of rhodopsin-associated autosomal dominant retinitis pigmentosa (RHO-adRP), a progressive form of retinal degeneration. EDIT-103 is administered via subretinal injection and uses two adeno-associated virus (AAV) vectors to knock out and replace mutations in the rhodopsin gene to preserve photoreceptor function. This approach can potentially address more than 150 gene mutations that cause RHO-adRP.

About Editas Medicine
As a leading genome editing company, Editas Medicine is focused on translating the power and potential of the CRISPR/Cas9 and CRISPR/Cas12a genome editing systems into a robust pipeline of treatments for people living with serious diseases around the world. Editas Medicine aims to discover, develop, manufacture, and commercialize transformative, durable, precision genomic medicines for a broad class of diseases. Editas Medicine is the exclusive licensee of Harvard's and the Broad Institute's Cas9 patent estates and the Broad Institute's Cas12a patent estate for human medicines. For the latest information and scientific presentations, please visit http://www.editasmedicine.com.

Forward-Looking Statements
This press release contains forward-looking statements and information within the meaning of The Private Securities Litigation Reform Act of 1995. The words "anticipate," "believe," "continue," "could," "estimate," "expect," "intend," "may," "plan," "potential," "predict," "project," "target," "should," "would," and similar expressions are intended to identify forward-looking statements, although not all forward-looking statements contain these identifying words. The Company may not actually achieve the plans, intentions, or expectations disclosed in these forward-looking statements, and you should not place undue reliance on these forward-looking statements. Actual results or events could differ materially from the plans, intentions and expectations disclosed in these forward-looking statements as a result of various factors, including: uncertainties inherent in the initiation and completion of preclinical studies and clinical trials and clinical development of the Company's product candidates; availability and timing of results from preclinical studies and clinical trials; whether interim results from a clinical trial will be predictive of the final results of the trial or the results of future trials; expectations for regulatory approvals to conduct trials or to market products; and availability of funding sufficient for the Company's foreseeable and unforeseeable operating expenses and capital expenditure requirements. These and other risks are described in greater detail under the caption "Risk Factors" included in the Company's most recent Annual Report on Form 10-K, which is on file with the Securities and Exchange Commission, as updated by the Company's subsequent filings with the Securities and Exchange Commission, and in other filings that the Company may make with the Securities and Exchange Commission in the future. Any forward-looking statements contained in this press release speak only as of the date hereof, and the Company expressly disclaims any obligation to update any forward-looking statements, whether because of new information, future events or otherwise.

Read the original here:
Editas Medicine Presents Preclinical Data on EDIT-103 for Rhodopsin-associated Autosomal Dominant Retinitis Pigmentosa at the European Society of Gene...

Mathematical model could bring us closer to effective stem cell therapies – Michigan Medicine

Until recently, researchers could not see gene expression in an individual cell. Thanks to single cell sequencing techniques, they now can. But the timing of changes is still hard to visualize, as measuring the cell destroys it.

"To address this, we developed an approach based on models in basic physics," explained Welch, "treating the cells like they are masses moving through space, and we are trying to estimate their velocity."

The model, dubbed MultiVelo, predicts the direction and speed of the molecular changes the cells are undergoing.
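To give a flavor of the physics analogy, the toy sketch below writes the "velocity" of a cell's molecular state as the time derivatives of three coupled quantities: chromatin accessibility, unspliced RNA, and spliced RNA. This is not the published MultiVelo code; the rate parameters and equations are simplified assumptions in the spirit of kinetic RNA-velocity models, shown only to illustrate what "direction and speed of molecular change" means.

```python
import numpy as np

# Toy illustration of the velocity analogy (NOT the authors' MultiVelo implementation).
# Treat each cell's state for one gene as a point (c, u, s) and use simple kinetics
# to estimate where that point is heading. Parameter names and values are assumptions.

def state_velocity(c, u, s, alpha_c=1.0, alpha=2.0, beta=1.5, gamma=1.0):
    """Time derivatives of chromatin accessibility (c), unspliced RNA (u),
    and spliced RNA (s) under a simplified kinetic model."""
    dc_dt = alpha_c * (1.0 - c)    # chromatin opens toward full accessibility
    du_dt = alpha * c - beta * u   # transcription depends on how open the chromatin is
    ds_dt = beta * u - gamma * s   # splicing feeds in, degradation drains out
    return np.array([dc_dt, du_dt, ds_dt])

# Example: chromatin is already open but RNA has not caught up yet,
# so the model predicts expression is about to ramp up (large positive du/dt).
print(state_velocity(c=0.9, u=0.1, s=0.05))
```

Comparing which derivative leads and which lags is, loosely, how a model of this kind can say whether the epigenome or gene expression changes first.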


"Our model can tell us which things are changing first, the epigenome or gene expression, and how long it takes for the first to ramp up the second," said Welch.

They were able to verify the method using four types of stem cells from the brain, blood and skin, and identified two ways in which the epigenome and transcriptome can be out of sync. The technique provides an additional, and critical, layer of insight to so-called cellular atlases, which are being developed using single cell sequencing to visualize the various cell types and gene expression in different body systems.

By understanding the timing, Welch noted, researchers are closer to steering the development of stem cells for use as therapeutics.

"One of the big problems in the field is the artificially differentiated cells created in the lab never quite make it to full replicas of their real-life counterparts," said Welch. "I think the biggest potential for this model is better understanding what are the epigenetic barriers to fully converting the cells into whatever target you want them to be."

Additional authors on this paper include Chen Li, Maria C. Virgilio, and Kathleen L. Collins.

Paper cited: "Single-cell multi-omic velocity infers dynamic and decoupled gene regulation," Nature Biotechnology. DOI: 10.1038/s41587-022-01476-y


More here:
Mathematical model could bring us closer to effective stem cell therapies - Michigan Medicine

Developing New Tools to Fight Cancer – Duke University School of Medicine

For decades, medical cancer treatment has generally meant chemotherapy, radiation, or surgery, alone or in combination. But things are changing rapidly. Today, new approaches such as immunotherapies and targeted therapies are becoming available, with many more in research and development. In many cases, the new treatments are more effective, with fewer side effects.

"It's an exciting time to be in cancer research and cancer discovery," said Colin Duckett, PhD, professor of pathology, interim chair of the Department of Pharmacology and Cancer Biology, and vice dean for basic science.

"We're moving into this era where we have a new set of tools we can use to treat cancer." -Colin Duckett, PhD

Researchers in the Duke Cancer Institute (DCI) and across the School of Medicine are helping to create these new tools, fueled by the knowledge and experience of experts from a wide range of disciplines.

Indeed, cancer research has always been a team-based endeavor at DCI.

"DCI was specifically created a decade ago to break down barriers between disciplines to stimulate collaborative research and multidisciplinary interaction," said DCI Executive Director Michael Kastan, MD, PhD, the William and Jane Shingleton Distinguished Professor of Pharmacology and Cancer Biology.

Adding fuel to the fire is the Duke Science and Technology (DST) initiative, which aims to catalyze and support collaborative research in service of solving some of the world's most pressing problems, including cancer.

The new tools, though varied, all represent advances in personalized cancer medicine. Targeted treatments are chosen based on the genetic signature of a patient's tumor. Some immunotherapies take personalization even further, by manipulating a patient's own immune cells to create a treatment for that individual alone.

To match treatments to patients, the multidisciplinary Duke Molecular Tumor Board, led by John Strickler, MD, HS'11, and Matthew McKinney, MD'06, HS'06-'09, HS'10-'13, helps providers identify best practices, newly approved treatments, or clinical trials for advanced cancer patients based on genetic sequencing of their tumors.

"In precision cancer medicine, the right therapy for the right patient at the right time, all these things come together: the targeted therapies, the immunotherapy, even standard chemotherapy. All of that is part of precision cancer medicine." -Michael Kastan, MD, PhD

Immunotherapy aims to harness the power of the immune system to fight cancer. That can mean activating the immune system, energizing exhausted immune cells, or helping immune cells find cancer cells by guiding them there or by removing cancer's "good guy" disguises.

Duke's Center for Cancer Immunotherapy supports these efforts by identifying promising basic science discoveries and building teams to translate those ideas into treatments.

"There are so many world-class basic research scientists here making discoveries..."-Scott Antonia, MD, PhD

"...discoveries that are potentially translatable as immunotherapeutic strategies," said Scott Antonia, MD, PhD, professor of medicine and the center's founding director. "That's what motivated me to come to Duke, because of the great opportunity to interact with basic scientists to develop new immunotherapeutics and get them into the clinic."

Antonia believes immunotherapy has the potential to revolutionize cancer treatment, but more work remains to be done to realize its promise. "The proof of principle is there," he said, "but still only a relatively small fraction of people enjoy long-term survival. If we can hone immunotherapeutic approaches, that's our best opportunity."

Some of the most exciting immunotherapy work being facilitated by the center involves removing a patient's own T cells (a type of lymphocyte), manipulating them in the lab to make them more effective against tumors, then injecting them back into the patient.

T cells can be manipulated in the lab in a number of different ways. In one approach, called CAR T-cell therapy, the T cells are engineered with the addition of synthetic antibody fragments that bind to the patient's tumor, effectively directing the T cells to the tumor cells.

In another approach, called tumor-infiltrating lymphocyte (TIL) adoptive cell therapy, the subset of a patient's T cells that have already managed to find their way into the tumor are extracted and then grown to large numbers before being returned to the patient. Antonia and his colleagues recently published a paper demonstrating the effectiveness of TIL expansion in lung cancer. "We're now doing the preparative work to develop clinical trials using this approach in brain tumors, and our intention is to expand into many other cancers as well," he said.

Antonia points out that innovations in CAR T-cell therapy and TIL therapy happening at Duke are possible because of collaborations with scientists in an array of disciplines, including antibody experts like Barton Haynes, MD, HS'73-'75, the Frederic M. Hanes Professor of Medicine, and Wilton Williams, PhD, associate professor of medicine and surgery, at the Duke Human Vaccine Institute, and biomedical engineers like Charles Gersbach, PhD, the John W. Strohbehn Distinguished Professor of Biomedical Engineering at the Pratt School of Engineering.

Furthermore, clinical trials for these kinds of cellular therapies require special facilities to engineer or expand the cells, which are provided by Duke's Marcus Center for Cellular Cures, led by Joanne Kurtzberg, MD, the Jerome S. Harris Distinguished Professor of Pediatrics, and Beth Shaz, MD, MBA, professor of pathology. "It's been a very productive collaboration, highlighting how Duke is uniquely positioned to develop immunotherapeutic strategies," Antonia said.

Targeted therapies exploit a tumor's weak spot: a genetic mutation, for example. The benefit is that the treatment kills only cancer cells and not healthy cells. The prerequisite is knowing the genetics and biology of the specific tumor, no simple task.

Trudy Oliver, PhD'05, who joined the Department of Pharmacology and Cancer Biology faculty as a Duke Science and Technology Scholar, studies cancer development and the biology of tumor subtypes, particularly squamous cell lung cancer and small cell lung cancer.

"Even within small cell lung cancer, there are subsets that behave differently from each other," she said.

"Our work suggests that when you tailor therapy to those subsets, you can make a difference in outcome." -Trudy Oliver, PhD'05

Some of the treatments she's identified are in clinical trials.

Sandeep Dave, MD, Wellcome Distinguished Professor of Medicine, is leading an ambitious project to analyze the genomics of the more than 100 different types of blood cancer. His project will streamline the diagnosis of blood cancer and uncover potential therapy targets.

"All cancers arise from genetic alterations that allow cancer to survive and thrive at the expense of the host," he said. "These genetic alterations are a double-edged sword: they allow these cancer cells to grow, but on the other hand they do confer specific vulnerabilities that we can potentially exploit."

Dave said his background in computer science, genetics, and oncology helped him as he designed the project, which uses huge datasets.

"We've done the heavy lifting in terms of tool development and methodology, which is ripe to be applied to every other type of cancer." -Sandeep Dave, MD

Cancer disparities are caused by a complex interplay of elements, including access to health care and other resources, institutional barriers, structural racism, and biology, such as ancestry-related genetics. Both genetic factors and social elements, for example, contribute to disparities in many types of cancer.

"Cancer treatment is approaching this personalized space where patients are no longer treated with a one-size-fits-all paradigm." -Tammara Watts, MD, PhD

"Its becoming increasingly apparent that there are differences in outcome with respect to race and ethnicity, said Tammara Watts, MD, PhD, associate professor of head and neck surgery & communication sciences, and associate director of equity, diversity, and inclusion at DCI. The very broad hypothesis is that there are genetic ancestry-related changes that may play a critical role in the disparate clinical outcomes we see every day in our cancer patients.

For example, self-identified white patients with throat cancer associated with the human papilloma virus (HPV) have better outcomes compared to self-identified Black patients, even when controlling for elements such as health care access, education, and socioeconomic status.

Watts is collaborating with bioinformatics experts at DCI to try to identify significant differences in gene expression between the two groups.

"I'm trying to tease out differences that may be impactful for disadvantaged patients based on race and ethnicity," she said. "But there could be differences that emerge that could be useful for designing targeted treatments for a broad group of patients."

That's because a targeted treatment for a particular genetic expression that might occur more commonly in Black people would help all patients with that expression, regardless of race or ethnicity.

Watts is far from alone in doing cancer disparity research at DCI. Tomi Akinyemiju, PhD, associate professor in population health sciences, uses epidemiology to study both biological factors and social elements that contribute to disparities in many types of cancer.

Jennifer Freedman, PhD, associate professor of medicine, Daniel George, MD'92, professor of medicine, and Steven Patierno, PhD, professor of medicine and deputy director of DCI, are studying the molecular basis for why prostate, breast, and lung cancer tend to be more aggressive and lethal in patients who self-identify as Black. Patierno, who has been a national leader in cancer disparities research for more than 20 years, leads the Duke Cancer Disparities SPORE (Specialized Program of Research Excellence), funded by the National Cancer Institute. The SPORE grant supports these researchers as well as other DCI teams working on cancers of the breast, lung, stomach, and head and neck.

"One of the things that impresses me is that [cancer disparities research] is a high priority within DCI," said Watts, who joined the faculty in 2019. "These groups are actively engaged and collaborating and asking the questions that will drive change for patients who have worse outcomes that are related to ancestry."

Even better than a cancer cure is avoiding cancer altogether.

At DCI, Meira Epplein, PhD, associate professor in population health sciences, and Katherine Garman, MD'02, MHS'02, HS'02-'06, HS'09, associate professor of medicine, are looking to decrease the incidence of stomach cancer by improving detection and treatment of the bacterium Helicobacter pylori, which can set off a cascade leading to stomach cancer. Epplein and Garman, also funded by the Duke Cancer Disparities SPORE grant, hope their work will reduce disparities because H. pylori infections and stomach cancer are both more prevalent among African Americans than among whites.

When preventing cancer isn't successful, the next best thing is to detect and treat early. A relatively new concept in cancer care is interception, which means catching cancer just as, or even just before, it begins.

"The point is to prevent it from progressing to full-blown malignancy," said Patierno. In other words, stop the cancer from getting over its own goal line.

Patierno envisions a future where patients with pre-cancerous conditions or early cancer could take a pill to halt cancer development without killing cells: in other words, a non-cytotoxic treatment, unlike standard chemotherapy.

"We know it's there, but we're not going to poison it or burn it or cut it out, because all of those have side effects. We're going to find a non-cytotoxic way to prevent it from progressing. That's the goal." -Steven Patierno, PhD

Read About Alumni Making a Difference in Cancer Research and Care:

Changing the Status Quo: Lori Pierce, MD'85

Treating the Whole Person: Arif Kamal, MD, HS'12, MHS'15

Targeting the Seeds of Cancer Growth: Eugenie S. Kleinerman, MD'75, HS'75

A Discovery That Comes Out of Nowhere: Bill Kaelin, BS'79, MD'82

Story originally published in DukeMed Alumni News, Fall 2022.

Read more from DukeMed Alumni News

More here:
Developing New Tools to Fight Cancer - Duke University School of Medicine

CANbridge-UMass Chan Medical School Gene Therapy Research in Oral Presentation at the European Society of Gene and Cell Therapy (ESGCT) 29th Annual…

BEIJING & BURLINGTON, Mass.--(BUSINESS WIRE)--CANbridge Pharmaceuticals Inc. (HKEX:1228), a leading global biopharmaceutical company, with a foundation in China, committed to the research, development and commercialization of transformative rare disease and rare oncology therapies, announced that data from its gene therapy research agreement with the Horae Gene Therapy Center, at the UMass Chan Medical School, was presented at the 29th European Society of Gene and Cell Therapy Annual Congress in Edinburgh, Scotland, today.

In an oral presentation, Guangping Gao, Ph.D., Co-Director, Li Weibo Institute for Rare Diseases Research, Director, the Horae Gene Therapy Center and Viral Vector Core, Professor of Microbiology and Physiological Systems, and Penelope Booth Rockwell Professor in Biomedical Research at UMass Chan Medical School, discussed the study, which was led by the investigator Jun Xie, Ph.D., and his team from Dr. Gao's lab, and titled "Endogenous human SMN1 promoter-driven gene replacement improves the efficacy and safety of AAV9-mediated gene therapy for spinal muscular atrophy (SMA) in mice."

The study showed that a novel second-generation self-complementary AAV9 gene therapy, expressing a codon-optimized human SMN1 gene under the control of its endogenous promoter (scAAV9-SMN1p-co-hSMN1), demonstrated superior safety, potency, and efficacy across several endpoints in an SMA mouse model when compared to the benchmark vector, scAAV9-CMVen/CB-hSMN1, which is similar to the vector used in the gene therapy approved by the US Food and Drug Administration for the treatment of SMA. The benchmark vector expresses a human SMN1 transgene under a cytomegalovirus enhancer/chicken β-actin promoter for ubiquitous expression in all cell types, whereas the second-generation vector utilizes the endogenous SMN1 promoter to control gene expression in different tissues. Compared to the benchmark vector, the second-generation vector resulted in a longer lifespan, better restoration of muscle function, and more complete neuromuscular junction innervation, without the liver toxicity seen with the benchmark vector.

This, the first data to be presented from the gene therapy research collaboration between CANbridge and the Gao Lab at the Horae Gene Therapy Center, was also presented at the American Society of Gene & Cell Therapy (ASGCT) Annual Meeting in May 2022. Dr. Gao is a former ASGCT president.

Oral Presentation #: OR57

Category: AAV next generation vectors

Presentation Date and Time: Thursday, October 13, 5:00 PM BST

Authors: Qing Xie, Hong Ma, Xiupeng Chen, Yunxiang Zhu, Yijie Ma, Leila Jalinous, Qin Su, Phillip Tai, Guangping Gao, Jun Xie

Abstracts are available on the ESGCT website: https://www.esgctcongress.com/

About the Horae Gene Therapy Center at UMass Chan Medical School

The faculty of the Horae Gene Therapy Center is dedicated to developing therapeutic approaches for rare inherited diseases for which there is no cure. We utilize state-of-the-art technologies to either genetically modulate mutated genes that produce disease-causing proteins or introduce a healthy copy of a gene if the mutation results in a non-functional protein. The Horae Gene Therapy Center faculty is interdisciplinary, including members from the departments of Pediatrics, Microbiology & Physiological Systems, Biochemistry & Molecular Pharmacology, Neurology, Medicine and Ophthalmology. Physicians and PhDs work together to address the medical needs of rare diseases, such as alpha 1-antitrypsin deficiency, Canavan disease, Tay-Sachs and Sandhoff diseases, retinitis pigmentosa, cystic fibrosis, amyotrophic lateral sclerosis, TNNT1 nemaline myopathy, Rett syndrome, NGLY1 deficiency, Pitt-Hopkins syndrome, maple syrup urine disease, sialidosis, GM3 synthase deficiency, Huntington disease, and others. More common diseases such as cardiac arrhythmia and hypercholesterolemia are also being investigated. The hope is to treat a wide spectrum of diseases by various gene therapeutic approaches. Additionally, the University of Massachusetts Chan Medical School conducts clinical trials on site, and some of these trials are conducted by the investigators at the Horae Gene Therapy Center.

About CANbridge Pharmaceuticals Inc.

CANbridge Pharmaceuticals Inc. (HKEX:1228) is a global biopharmaceutical company, with a foundation in China, committed to the research, development and commercialization of transformative therapies for rare disease and rare oncology. CANbridge has a differentiated drug portfolio, with three approved drugs and a pipeline of 11 assets, targeting prevalent rare disease and rare oncology indications that have unmet needs and significant market potential. These include Hunter syndrome and other lysosomal storage disorders, complement-mediated disorders, hemophilia A, metabolic disorders, rare cholestatic liver diseases and neuromuscular diseases, as well as glioblastoma multiforme. CANbridge is also building next-generation gene therapy development capability through a combination of collaboration with world-leading researchers and biotech companies and internal capacity. CANbridges global partners include Apogenix, GC Pharma, Mirum, Wuxi Biologics, Privus, the UMass Chan Medical School and LogicBio.

For more on CANbridge Pharmaceuticals Inc., please go to: http://www.canbridgepharma.com.

Forward-Looking Statements

The forward-looking statements made in this article relate only to the events or information as of the date on which the statements are made in this article. Except as required by law, we undertake no obligation to update or revise publicly any forward-looking statements, whether as a result of new information, future events or otherwise, after the date on which the statements are made or to reflect the occurrence of unanticipated events. You should read this article completely and with the understanding that our actual future results or performance may be materially different from what we expect. In this article, statements of, or references to, our intentions or those of any of our Directors or our Company are made as of the date of this article. Any of these intentions may alter in light of future developments.

Here is the original post:
CANbridge-UMass Chan Medical School Gene Therapy Research in Oral Presentation at the European Society of Gene and Cell Therapy (ESGCT) 29th Annual...

Winners of ninth annual Vision Research Workshop named – Wayne State University

The poster and oral presentation winners of the Wayne State University School of Medicine's ninth annual Vision Research Workshop have been announced.

The workshop, held Oct. 12, was presented by the Department of Ophthalmology, Visual and Anatomical Sciences, and the Kresge Eye Institute.

Presentation winners included:

Poster Presentations

First place: Nicholas Pryde, Assessment of Nanodropper™ eyedropper attachment

Second place: Bing Ross, Mechanism of Preferential Calcification in Hydrophilic Versus Hydrophobic Acrylic Intraocular Lens

Third place: Pratima Suvas, Expression, Localization, and Characterization of CXCR4 and its ligand CXCL12 in herpes simplex virus-1 infected corneas

Oral Presentations

First place: Ashley Kramer, A comparative analysis of gene and protein expression in a zebrafish model of chronic photoreceptor degeneration

Second place: Jeremy Bohl, Long-distance cholinergic signaling contributes to direction selectivity in the mouse retina

Third place: Zain Hussain, Diagnostic and Treatment Patterns of Age-Related Macular Degeneration among Asian Medicare Beneficiaries

Mark Juzych, M.D., chair of the Department of Ophthalmology, Visual and Anatomical Sciences, and director of the Kresge Eye Institute, gave welcome remarks. Linda Hazlett, Ph.D., vice dean of Research and Graduate Programs and vice chair of the department, provided an overview of research.

The keynote speaker, giving the annual Robert N. Frank, M.D., Clinical Translational Lecture, was Reza Dana, M.D., M.P.H., the Claes H. Dohlman Chair and vice chair for Academic Programs in Ophthalmology at Harvard Medical School, who presented "New Ways of Doing Old Things: Translational Investigations in Management of Common Corneal and Ocular Surface Disorders."

See the original post:
Winners of ninth annual Vision Research Workshop named - Wayne State University