Monthly Archives: September 2020

Walk-through cancer diagnoses and robotics muscles among groundbreaking projects backed by government – GOV.UK

Posted: September 9, 2020 at 11:20 am

Debilitating diseases such as cancer and osteoarthritis could be identified and treated faster and more effectively, thanks to 6 projects benefiting from £32 million of government funding.

As part of a keynote speech on research and development at London Tech Week 2020, the Science Minister Amanda Solloway will today (Monday 7 September) announce 6 new projects aimed at developing revolutionary technological approaches to transform care and treatments in the NHS by 2050, helping to improve people's quality of life as they age.

InlightenUs, led by the University of Edinburgh, will receive £5.4 million to use a combination of artificial intelligence (AI) and infra-red lasers to produce fast, high-resolution 3D medical images, helping to identify diseases in patients more quickly.

Working with the universities of Nottingham and Southampton, the new research will initially be developed for use on hospital wards and in GP surgeries, and by 2050 aims to scale up to walk-through, airport-style X-ray scanners, which will be able to pick up detailed images of structures often hidden within the human body that can reveal tumours.

Another of the 6 projects, emPOWER, will be led by researchers at the University of Bristol and will receive £6 million to develop artificial robotic muscular assistance to help restore strength in people who have lost muscle capability. This could include patients who have suffered a stroke or are living with degenerative diseases such as sarcopenia and muscular dystrophy.

Using these highly targeted robotics will help overcome the limitations of current wearable assistive technology or regenerative medicine. Often, these technologies can be bulky and uncomfortable to wear, and can require 2 people to put on and take off. Users can also find the movements too slow. Through using robots, emPOWER will provide life-changing benefits for sufferers, restoring their confidence, independence and quality of life, all while reducing the cost to the NHS.

Ahead of her keynote speech on R&D at London Tech Week, Science Minister Amanda Solloway said:

The pioneering projects we are backing today will help modernise healthcare, improving all of our lives now and into the future.

Today's announcement is part of our ambitious R&D Roadmap and underlines our commitment to back our incredible scientists and researchers and invest in ground-breaking research to keep the UK ahead in cutting-edge discoveries.

The funding is being delivered through the Engineering and Physical Sciences Research Council (EPSRC), part of UK Research and Innovation, through the Transformative Healthcare Technologies for 2050 call.

As part of her speech, Minister Solloway will set out the government's ambitions for research to address significant issues such as advancing healthcare outcomes for patients and ensuring the UK is at the forefront of transformational technologies like artificial intelligence.

It follows the launch of the government's R&D Roadmap in July 2020, which detailed plans to make the UK the best place in the world for scientists and researchers to live and work, building on the government's commitment to increase public R&D spending to £22 billion per year by 2024 to 2025.

Innovation minister Lord Bethell said:

Throughout this global pandemic, the NHS has continued to be there for us all and to treat cancer patients and those living with chronic illness as a priority.

These pioneering new projects will help us further improve care for patients and make life easier for NHS staff, cementing the UK's status as a world leader in research and technology and ultimately saving thousands of lives.

EPSRC Executive Chair, Professor Dame Lynn Gladden, said:

The projects announced today will develop new approaches which could become routine in the NHS and community and home care in the coming decades.

Harnessing the latest technologies and the UK's world-leading expertise will allow us to deliver a step-change in how healthcare is delivered and benefit millions of people, emphasising the critical role the UK's R&D sector plays in improving the health of the nation.

A project led by Imperial College London will receive £5.5 million to develop a Non-Invasive Single Neuron Electrical Monitoring technology, which, when combined with AI, will allow researchers to monitor the brain in a way never achieved before. This will help scientists gain a better understanding of neurological diseases such as Parkinson's and Alzheimer's. Current approaches to monitoring the brain are invasive, so this new method would enable new pharmacological and neurotechnology-based treatments to be developed that are far more effective than any current treatments.

A project led by Edinburgh Napier University will receive £3.2 million to develop hearing aids designed to autonomously adapt to the nature and quality of their surroundings. Currently, only 40% of people who could benefit from hearing aids have them, while most current devices make only limited use of speech enhancement. These hearing aids would adapt to the nature and quality of the visual and acoustic environment around them, resulting in greater intelligibility of speech in noise and potentially reduced listening effort for the listener.

A project led by the University of Glasgow will receive £5.5 million to create a home of the future, providing homeowners with feedback on their health and wellbeing. Bringing clinically approved sensors into the living environment will enable individuals, carers or healthcare professionals to monitor blood flow, heart rate and even brain function in the home. Monitoring physical and emotional wellbeing in the home will enable tailored programmes to be built for lifestyle improvement, as well as rehabilitation.

A project led by Heriot-Watt University, in partnership with the universities of Bath and Edinburgh, will receive £6.1 million to exploit new laser, optical fibre and imaging technologies, delivering therapy for bacterial diseases and viruses in confined regions of the body, such as the lungs, catheters inserted into the body for prolonged periods, and areas that have been subject to surgical procedures. In cancer surgery, the platform will be able to cut out single cells while leaving surrounding cells undamaged, aiming to offer a cure for currently unresectable tumours: tumours that are too close to critical structures and cannot be cut away safely with current approaches.

The Engineering and Physical Sciences Research Council (EPSRC), in collaboration with the Medical Research Council, will shortly be inviting proposals for adventurous projects as part of the second phase of the Transformative Healthcare Technologies for 2050 call.

This call will target projects that are guided by a longer-term vision to pursue new, high-risk, high-reward ideas and develop thinking and approaches supported by the next generation of underpinning science, engineering and emerging technologies in the healthcare space. We seek and encourage adventurous ideas, new thinking and collaborations that have the potential to significantly improve healthcare delivery by 2050. Read further details.

Follow this link:

Walk-through cancer diagnoses and robotics muscles among groundbreaking projects backed by government - GOV.UK

Posted in Robotics | Comments Off on Walk-through cancer diagnoses and robotics muscles among groundbreaking projects backed by government – GOV.UK

Ahead of Its Game, the US Army Developed Six-Legged Walking Robots in the 1980s – Interesting Engineering

Posted: at 11:20 am

The 1980s weren't just the era of synthpop, bright clothing, and huge hairdos. They were also a time when the U.S. Army was thinking ahead of its time, developing new technologies to carry its troops to the next step, quite literally.

And so it worked on the development of a massive six-legged hydraulic robot truck that was operated by one person. It was called the Adaptive Suspension Vehicle (ASV), and it looked like something out of the Star Wars movies.

The Army worked alongside Ohio State University (OSU) researchers to create the vehicle, along with a number of outside contractors.

The Drive published an extensive report on the vehicle.


The ASV was impressive in size and automation for its time. Unfortunately, it was also very slow and couldn't carry a big payload. That said, it's still a rather impressive piece of engineering and robotics.

The project kicked off in 1981, was led by Robert McGhee and Kenneth Waldron of OSU, and was developed over nine years, per The Drive's report.

At the time, it took 17 OSU computers to run the behemoth robot and ensure its operator wasn't exhausted from controlling six separate robot legs by the end of the day. The computers managed a number of tasks, such as driving the cathode ray tube (CRT) displays in the cockpit, choosing the best footing, and analyzing the data gathered by the six feet.

All of the collected data was then processed by operating software, written in Pascal and comprising 150,000 lines of source code.

The driver used a keypad and a joystick to select the vehicle's direction. As per the original article covering the ASV's capabilities, the end goal was for it to drive autonomously; however, that day never arrived.

The ASV was able to move thanks to a 900cc motorcycle engine placed in the center of the machine, offering 91 horsepower at its peak. There were a whopping 18 variable-displacement pumps, managed by a complex operating system.

The vehicle could move at up to 8 mph (13 km/h), and even at that slow pace it wasn't a smooth ride. As per the original OSU article, the regular cruising speed was closer to 4 mph (6.4 km/h).

What was also cool was that it boasted six drive modes: utility, precision footing, close maneuvering, follow the leader, terrain following, and cruise.

It weighed 5,952 pounds (2,700 kg), yet could carry only 485 pounds (220 kg) of payload. It was 17 feet (5 meters) long, 7.9 feet (2.4 meters) wide, and stood 9.8 feet (2.9 meters) tall. A pretty big truck unable to carry much payload or many people.

It could, however, walk over obstacles up to 6.9 feet (2.1 meters) tall and stretch over trenches as wide as 23 feet (7 meters). Regardless of some of its impressive features, especially for its time, the project was stopped in 1990, and the ASV has since been lost from sight.

Instead, the U.S. Army and DARPA have been working on some other interesting projects.

Go here to see the original:

Ahead of Its Game, the US Army Developed Six-Legged Walking Robots in the 1980s - Interesting Engineering

Posted in Robotics | Comments Off on Ahead of Its Game, the US Army Developed Six-Legged Walking Robots in the 1980s – Interesting Engineering

How Does GMO Affect the Insects Around Us? – iharare.com

Posted: September 7, 2020 at 2:28 am

GMOs have been around for a while now, and they continue to be all the rage. Scientists have long since learned to alter the genes of plants to favour the development of certain traits in those plants.

When you hear about GMOs for the first time, it is possible that you have questions concerning them. What exactly is GMO? How does it affect the surrounding plants?

Are there plants that are resistant to its effects? Are there any negative effects of growing genetically modified crops? What are the short-term and long-term effects on insects, biodiversity, and the ecological environment?

The answers to these questions will become evident as you read on.

GMO stands for Genetically Modified Organism. GMOs are organisms whose genes and genetic behaviours have been altered with the help of genetic engineering.

In case you were wondering what genetic engineering is, it simply involves modifying the phenotype of an organism by reconstructing its genetic makeup. Most of the time, GMOs come about through mating or the recombination of genes.

Genetic engineering has since pervaded agriculture, with crops being genetically modified. The aim of genetically modifying crops is to make the plants more resistant to diseases, pests, chemical treatments, or environmental conditions.

Some other crops are genetically modified to make them more nutritious. A common example of a crop that was genetically modified to increase its nutritional value is golden rice.

The need for genetically modified crops soared when pesticides were deemed incapable of adequately controlling yield-diminishing pests. Not all insects on a farm are villains.

Some actually have their advantages. The majority of them, however, never have good plans for crops. When you use pesticides on these crops, they kill all insects, good and bad alike. Usually, this is not what farmers want.

They want the good insects to remain and the bad ones to go. Also, spraying crops with pesticides could have negative effects on the crops themselves. With these major disadvantages of pesticides, the need for something better increased. Something that would kill the pests but not the good insects. That is how the need for genetically modified crops arose.

It used to be that genetically modified crops had their protein-manufacturing system modified so that a kind of protein previously absent from the crops was added. This protein is harmful to select insects.

The genetic engineering of these plants is such that when pests eat crops containing this harmful protein, it ruptures their stomachs and kills them. And while these crops are lethal to pests, they don't affect other insects and have some other advantages too.

However, this use of genetically modified crops came under heavy debate when questions about the effects of these crops on humans popped up.

The solution came in the form of RNA-modified crops. These crops are genetically modified to produce RNA fragments, which get into pests and target the genes responsible for reproduction, or for life itself, in them.

The result is that the pests are either killed or rendered incapable of reproducing. Ralph Bock, a director at the Max Planck Institute of Molecular Plant Physiology, a renowned institution in Germany, had something to say about this method.

He said, "RNA interference-based pest control can provide protection at essentially no cost because once the variety is developed, the plant can just go on using it instead of needing additional applications of insecticide."

This way, controlling pest infestation becomes a lot more effective.

GMOs have delivered the desired effects on both insects and critters. The ones we don't want feeding on our crops are either killed or rendered impotent by the genetically modified crops.

Also, genetically modified crops don't affect other insects, animals, and organisms whose presence on our farms is not detrimental. And when humans consume them, they do not suffer negative effects.

However, this is actively affecting biodiversity, and in no time some insects might go extinct. Given that the RNA fragments target the genes responsible for reproduction or life, the insects affecting the crops are unable to reproduce and bring forth another generation before they die off.

This also affects the food chain. Some insects depend on feeding on a particular species of insect to survive. If those insects are no longer available, or are very scarce, the predators would have to go to some lengths to find food, and in due time they would begin to die off.

Most people might think these effects are insignificant; however, they could have a huge impact on the ecological environment in general.

The onus is therefore on scientists to discover other ways of controlling pests that do not involve wiping out an entire species.

Also, rendering the insects impotent is a clear call for extinction. Instead, the spread of these insects could be controlled, or the plants modified so they no longer seem attractive to the insects.

The effects of GMOs can be pretty harsh, and they bear long-term consequences that most people haven't considered and thought through. For instance, some of the good insects on farms are there to consume undesirable ones, and their effect may be two-fold: they prevent the undesirables from destroying the plants, and they may aid pollination or some other essential process as well.

These are essential life processes that the use of GMOs may be altering, and the bottom line is that GMOs may not be as great as most people think.



Link:
How Does GMO Affect the Insects Around Us? - iharare.com

Posted in Genetic Engineering | Comments Off on How Does GMO Affect the Insects Around Us? – iharare.com

How to fight the deadly dengue virus? Make your own mosquitoes – Livemint

Posted: at 2:28 am

Releasing mosquitoes into the corridors of apartment complexes might seem like an unusual strategy for a city fighting its worst recorded outbreak of dengue, a painful disease spread between humans by mosquitoes. But the thousands of little insects discharged last week weren't your average mosquitoes.

They were bred in a laboratory to carry something not commonly found in this type of mosquito: bacteria called Wolbachia. When the bacteria-laden male mosquitoes are released into the open and mate with naturally born females, the resultant eggs won't hatch.

The outcome is a reduced number of dengue cases in the areas where the lab-bred insects were released, according to Singapore's government.

Scientists and governments are expanding high-tech solutions like these as the threat from the dengue virus grows. Some are using genetically engineered mosquitoes; others are zapping them with X-ray beams to sterilize them.

The World Health Organization says roughly half the world's population is at risk of catching dengue, a viral infection that causes an intense flulike illness that is sometimes lethal. Growing urbanization and bulging cities have given mosquitoes vast human populations to feast on. Reported cases of the disease increased from about 500,000 in 2000 to 4.2 million in 2019, with tropical countries such as Brazil, Indonesia and the Philippines especially hard-hit.

Global warming could spread the disease further as both dengue-carrying mosquitoes and the virus itself thrive in warmer climates.

Dengue is transmitted by the female Aedes aegypti mosquito, which also spreads other diseases like Zika, which can cause severe birth defects when pregnant women are infected, and chikungunya, which causes fever and joint pain. Public-health campaigns have traditionally focused on simple solutions, such as encouraging people to empty stagnant water from household objects such as vases, pails and watering cans, where mosquitoes lay eggs. Insecticides are also used in dengue-prone areas.

But mosquitoes have developed immunity against common insecticides, and dengue cases are rising globally. That is why scientists turned to altering the mosquitoes themselves.

In Singapore, which has long suffered from dengue outbreaks, specialized mosquito breeding began with mosquito eggs shipped from Michigan. A team led by Zhiyong Xi, a professor in Michigan State University's Department of Microbiology and Molecular Genetics, used long, thin glass needles to inject Wolbachia into mosquito eggs, which resemble tiny grains of dirt, 90 minutes after they were laid. Upon hatching, the larvae also contained the bacteria.

That first generation passed the Wolbachia bacteria on to its descendants, birthing a new line of bacteria-infused mosquitoes whose eggs were shipped to Singapore to found the city-state's colony.

Before the offspring could be released, the females needed to be separated from the males, which don't bite or transmit the dengue virus. Sex-sorting is critical because Singapore's program hinges on mating males that carry the bacteria with females that don't. If both sexes carried the bacteria, the mosquitoes would successfully procreate, thwarting the program's goal of reducing the local mosquito population.

A machine developed by Verily, an Alphabet Inc. company focused on life sciences, uses automated mechanical sieves to separate female mosquito pupae, which are generally larger, from male ones. This step removes about 95% of females, the company says.

A computer-vision system is used to identify any females the sieve may have missed. The system looks for the female's distinct proboscis (mouthpart), antennae and other anatomical clues, flagging her for removal. Verily says substantially fewer than one in a million mosquitoes it releases is female, keeping Wolbachia from being inherited in the wild mosquito population.

Not all Wolbachia mosquitoes released in Singapore are sieved through Alphabet's machine. Others are subjected to low-dose X-ray irradiation using a specific methodology Singapore developed in collaboration with the International Atomic Energy Agency. The irradiation sterilizes female mosquitoes, so that any inadvertently released will be unable to reproduce and spread Wolbachia to future generations.
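The suppression logic behind these releases can be illustrated with a toy simulation: wild females that happen to mate with a released, incompatible male produce no viable eggs, so a large enough release ratio outpaces the population's natural growth. This is a deliberately simplified sketch; the release ratio, growth rate and random-mating assumption are illustrative, not Singapore's actual parameters.

```python
# Toy model of the incompatible insect technique described above.
# Assumption: discrete generations, random mating, and a fixed ratio of
# released incompatible males to wild males each generation.

def simulate_iit(wild_pop=10_000, release_ratio=5.0, growth=1.5, generations=8):
    """Return the wild population size after each generation.

    A wild female's mate is a released male with probability
    releases / (releases + wild_males); only matings with wild males
    produce viable offspring.
    """
    history = [float(wild_pop)]
    for _ in range(generations):
        wild_males = history[-1] / 2
        released = release_ratio * wild_males
        compatible_fraction = wild_males / (wild_males + released)
        history.append(history[-1] * growth * compatible_fraction)
    return history

pops = simulate_iit()
# With growth 1.5 and a 5:1 release ratio, each generation shrinks by
# a factor of 1.5 / 6 = 0.25, so the population collapses quickly.
print([round(p) for p in pops])
```

The key design point is that suppression only works while releases continue at a ratio high enough that `growth / (1 + release_ratio) < 1`; stop releasing and the wild population rebounds.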

Singapore's government says that in parts of the city where its males have been released, there were 65% to 80% fewer dengue cases compared with areas where the mosquitoes weren't released. Mosquitoes are now being discharged in 5% of the city's public housing blocks. The releases are slated to expand to 15% of them by 2022.

Other programs want the Wolbachia to be inherited widely in wild populations. That is because those programs have found that the bacteria has another feature: it strongly reduces the Aedes aegypti mosquito's ability to transmit dengue to humans.

The World Mosquito Program, a nonprofit active in a dozen countries in Asia, the Pacific and Latin America, released lab-bred, bacteria-carrying mosquitoes, both male and female, in the city of Yogyakarta in Indonesia. It counted on the fact that female mosquitoes will produce offspring that also carry the bacteria, meaning the dengue-blocking feature is passed down.

Its trial showed a 77% reduction in dengue cases in areas where the mosquitoes were released compared with areas where they werent, the nonprofit said in August.

This method is much simpler than Singapore's technique, which involves complex sex-sorting. But some scientists say releasing females with Wolbachia is potentially irreversible. If the Wolbachia turns out to have unintended consequences, it would be very difficult to remove the bacteria from the mosquito population, they say.

One laboratory study found that carrying Wolbachia enhanced the infection rate of West Nile virus in the Culex tarsalis species of mosquito, which is endemic to North America. "It's a big black box," said Jason Rasgon, professor of disease epidemiology at Pennsylvania State University, arguing that more research should be done on Wolbachia's effects on the transmission of other diseases before further large-scale releases.

Cameron Simmons, a director at the World Mosquito Program, said many governments have conducted risk assessments of its approach. "On balance, Wolbachia represented a negligible risk compared to doing nothing," he said.

One company is going in a different direction altogether: genetic engineering. Oxitec, a U.S.-owned biotechnology company with research bases in the U.K. and Brazil, inserts a new gene into eggs that makes female mosquitoes die shortly after hatching, while they are still in the larval stage of development.

Last year, Oxitec conducted a trial of its latest gene-modified version, which it calls OX5034, in Indaiatuba, Brazil, near São Paulo. For the trial, the company produced OX5034 eggs at a factory in Brazil and distributed them at release points around the municipality. When the eggs hatched, the females died before they could become adults capable of flying and biting.

The males, which reached adulthood, mated with local wild females, passing along the female-killing genes, reducing Aedes aegypti mosquito numbers by about 95%, Oxitec said.

The company received U.S. federal approvals in May for pilot releases in Florida, which the company expects to begin next year.

Oxitec says the genes it has added are self-limiting, which means that after a few generations (about three to four months) the female-targeting gene is bred out of the population. Municipalities that wish to continue with the approach would carry on releasing OX5034 eggs to keep the mosquito population in check, it said, and those that don't would still have an off-ramp.
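The "self-limiting" behaviour can be illustrated with a toy model of transgene frequency. The genetics below are a simplified assumption for illustration (a single-copy gene inherited by half of all offspring, with carrier daughters dying as larvae), not Oxitec's actual construct.

```python
# Toy model of a self-limiting transgene: with no new releases, the gene's
# frequency among adult males roughly halves each generation, because half
# of a carrier's offspring never inherit it and carrier daughters die.

def carrier_frequency(initial=0.5, generations=6):
    """Fraction of adult males carrying the transgene, per generation."""
    freqs = [initial]
    for _ in range(generations):
        # Half of a carrier male's sons inherit the gene; non-carrier
        # matings contribute no carriers, so frequency halves.
        freqs.append(freqs[-1] * 0.5)
    return freqs

freqs = carrier_frequency()
print(freqs)  # halves each generation: 0.5, 0.25, 0.125, ...
```

This is why continued suppression requires periodic re-releases: without them, the female-killing trait dilutes itself out of the wild gene pool within a handful of generations.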

Jeffrey Powell, a biology professor at Yale University, sees drawbacks to the gene-modification approach. He said the need for periodic re-releases would get expensive, and over time wild mosquitoes may adapt to avoid mating with Oxitec's genetically doomed males. "There is no evidence it is doing anything bad," he said of the genes Oxitec has introduced into mosquitoes. "It's a complete unknown." He said he felt more comfortable with the use of Wolbachia, which is found naturally in many mosquito species.

Oxitec says it has released about one billion mosquitoes in the past decade and has no evidence female mosquitoes selectively mate with non-Oxitec males.

"There's no ecological footprint; there's no persistence," said Kevin Gorman, who heads field operations for Oxitec. "It's not going to permanently change the environment at all."

Write to Jon Emont at jonathan.emont@wsj.com


See original here:
How to fight the deadly dengue virus? Make your own mosquitoes - Livemint

Posted in Genetic Engineering | Comments Off on How to fight the deadly dengue virus? Make your own mosquitoes – Livemint

In the precision medicine era, the line between products and services is blurred – PMLiVE

Posted: at 2:28 am

Precision and personalised medicines are more than products; they are services in their own right. So, how should pharma approach this uncharted territory to ensure targeted therapies work for patients?

Personalised and precision medicines are exciting fields that focus on the development of treatment and prevention strategies for a single patient or patient group. The treatments are developed using cutting-edge technologies such as genomic sequencing and genetic engineering, helping to account for individual variability in both patient and disease characteristics.

This field has gained a lot of attention in recent years due to revolutionary breakthroughs in debilitating chronic diseases such as cancer. Traditionally, cancer patients are treated using one-size-fits-all interventions like chemotherapy, radiotherapy and surgery. These vary in their effectiveness and result in damage to healthy tissues.

Personalised and precision medicine, however, can offer specialised treatments that target the patient's unique cancer subtype, its genetic mutations and the affected tissues.

These therapies involve novel pathways and complex processes to aid and deliver treatment, making each therapy a service in its own right. They depend on many touchpoints, stakeholders, partnerships and interdependencies to treat patients.

As a result, designing suitable services to support patients, caregivers and healthcare providers throughout the treatment pathway is essential. However, doing so successfully depends on understanding how to best approach the design of services in this challenging landscape.

Optimising the service behind the personalised and precision medicine is crucial for turning the treatment into a viable and differentiated option for patients. To make a real difference and ensure the therapy is competitive, we need to adopt a service design approach.

Service design is a multidisciplinary art and science that enables us to take a holistic view of the service experience, along with a deep understanding of the target groups, such as patients and healthcare professionals, and the context they operate in. In practice, this means:

- Using empathic methodologies, such as in-depth interviews and field studies, to gain a comprehensive understanding of customers' needs, how they experience the current service, and how future services address their unmet needs
- Involving different stakeholders throughout the design process to gain a wide range of knowledge and expertise, and to further drive customer-centricity across the business
- Using visual tools such as sketches, maps and prototypes to improve and ease communication and collaboration between the different stakeholders involved in the creative process (surpassing language and knowledge boundaries)
- Following a learning-by-doing approach via continuous prototyping and testing to evaluate solutions before investing time and resources in development
- Understanding how the customer experiences the whole service journey, then identifying insight gaps and opportunities for service innovation by looking at the big picture

Personalised and precision medicines are naturally patient-centred (compared with traditional pharmaceuticals), as the individual patient is central to the product design. Taking this empathic approach throughout the design process provides a deeper understanding of patients' needs as well as their context.

This means not only adopting collaborative thinking during the design phase but also during production and development.

To deliver these unique therapies to patients, pharmaceutical companies must partner with a wide range of specialised third parties, including laboratories, manufacturers, and shipping and storage providers.


By engaging with multidisciplinary teams from all levels across the organisation, as well as numerous stakeholders during the co-creation process, you will increase the organisation's knowledge and expertise, resulting in better, more fit-for-purpose solutions. Bringing this sense of collaboration into the design process encourages a higher level of consistency, alignment and commitment to the patient, and ensures they are at the centre of the service philosophy.

Novel therapies require designers to be adaptive. New developments, such as changes in the supply chain, a shorter genomic sequencing process or the need for an additional quality assurance step, often lead to changes to the envisaged treatment pathway. As a result, it is necessary to have a view of the whole service, in one place, which can be continuously updated.

Visual tools such as customer journey maps and service blueprints are a core part of service design. Journey maps (such as the one featured on p.16) provide an overarching view of the customer experience, along with the pain points, gaps, unmet needs and opportunities for engagement. Service blueprints visualise the process behind the service and the people impacted by it. These tools not only make it easier to understand the service, but they can also help simplify communication and increase alignment between the many individuals engaged in the project.

For personalised and precision medicines, patient journeys and service blueprints can help capture the front-end of the service, which is visible to patients, and the back-end processes, which are used by healthcare professionals. This gives us insights into the interactions, touchpoints and relationships between the patient and various stakeholders, such as the different healthcare professionals, carers and patient groups. Looking at the entire service and all of its touchpoints from above is crucial for making improvements that enrich the customer experience.
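As a rough illustration of these artefacts, the touchpoints on a journey map can be captured in a minimal data structure that distinguishes the patient-visible front-end from the back-end processes. The fields and example entries below are hypothetical assumptions for illustration; real journey maps are richer, visual artefacts.

```python
# Minimal sketch of a journey map as a data structure. Stage names and
# pain points are invented examples, not taken from any real journey map.

from dataclasses import dataclass, field

@dataclass
class Touchpoint:
    stage: str
    actor: str                 # patient, carer, healthcare professional, ...
    visible_to_patient: bool   # front-end (True) vs back-end (False)
    pain_points: list = field(default_factory=list)

journey = [
    Touchpoint("Diagnosis", "patient", True, ["long wait for results"]),
    Touchpoint("Sample processing", "laboratory", False),
    Touchpoint("Treatment", "healthcare professional", True),
]

# Back-end touchpoints belong on the service blueprint, not the patient view.
back_end = [t.stage for t in journey if not t.visible_to_patient]
print(back_end)
```

Splitting the map this way mirrors the distinction the article draws: the journey map shows what patients experience, while the blueprint exposes the hidden processes behind each visible touchpoint.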

CAR-T is a new individualised cancer immunotherapy that has taken precision medicine to a new level. In a nutshell, CAR-T therapy involves extracting T-cells (a type of white blood cell that plays a key role in immune response) from the patient, genetically engineering them to target the cancer cells and infusing them back into the patient's body.

The CAR-T treatment pathway for a blood cancer involves a uniquely large number of stakeholders, touchpoints and interdependent processes that take place both in the front-end (i.e. visible to the patient) and the back-end (i.e. visible to healthcare professionals). Below is a high-level overview of a typical CAR-T journey that can illustrate this complexity:

1. After the patient has been identified as a suitable candidate for CAR-T therapy, they are referred by their primary oncologist to a specialised treatment centre to further assess treatment eligibility

2. Once eligibility has been established, the patient undergoes leukapheresis to extract T-cells

3. The samples are sent to a separate facility where they are frozen and prepared for shipping

4. The cells are then sent to a manufacturer where they are genetically engineered to target the patient's cancer cells and multiplied to create the CAR-T product

5. The product needs to be shipped back to the treatment centre and stored frozen until the patient is ready for infusion

6. The shipping and manufacturing processes can take 3-4 weeks, during which the patient receives bridging therapy (to slow down disease progression)

7. A few days before the infusion, the patient undergoes lymphodepleting chemotherapy to prepare their body

8. After the infusion, the patient needs to be closely monitored for side effects for 1-3 weeks. Some side effects (e.g. Cytokine Release Syndrome) can require hospitalisation

9. The post-infusion period involves continuous tumour assessment and long-term follow-up
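The nine steps above can be sketched as a simple ordered data model, the kind of structure a service blueprint formalises. This is a hypothetical illustration, not any real system: the stage names and the front/back labelling are shorthand for the patient-visible and back-end lanes described in the article.

```python
# Hypothetical sketch: the CAR-T pathway as an ordered list of stages.
# "front" = visible to the patient; "back" = back-end process.
CAR_T_PATHWAY = [
    ("referral to specialised treatment centre", "front"),
    ("leukapheresis to extract T-cells", "front"),
    ("samples frozen and prepared for shipping", "back"),
    ("cells genetically engineered and multiplied", "back"),
    ("CAR-T product shipped back and stored frozen", "back"),
    ("bridging therapy during 3-4 week manufacture", "front"),
    ("lymphodepleting chemotherapy", "front"),
    ("infusion and 1-3 weeks of side-effect monitoring", "front"),
    ("tumour assessment and long-term follow-up", "front"),
]

def stages_in_lane(lane):
    """Filter the journey for one swim lane of a service blueprint."""
    return [name for name, l in CAR_T_PATHWAY if l == lane]
```

Filtering by lane is exactly what a blueprint does visually: `stages_in_lane("back")` isolates the shipping and manufacturing processes that the patient never sees.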

We recently pitched to a pharmaceutical company preparing to launch their new CAR-T therapy to help them design a set of patient- and caregiver-supporting services. We quickly became aware of the complicated nature of this therapy and decided to kick off by mapping the treatment pathway and the actors involved.

We normally kick off this type of project by conducting primary research with customers (using empathic methodologies) to generate insights that can inform the journey design. However, due to its novelty, it was difficult to access patients who had recently undergone CAR-T therapy. Instead, we carried out in-depth interviews with different types of stakeholders who had considerable experience working on early CAR-T therapies and clinical trials. This gave us insights into the healthcare professionals' experience and visibility into the back-end processes.

The insights we gathered allowed us to understand the experience of patients and their caregivers. We could identify their emotional, practical and information-related needs and highlight the pain points that need to be addressed by the future services.

We also created empathy maps, another tool from the service design toolkit, to visually articulate what we know about the customers.

Once we completed the CAR-T patient and caregiver empathy maps, we created the CAR-T journey. The process relied heavily on co-creation by gathering input from key collaborators from the client company, including both medical and commercial personnel.

The continuous consolidation of insights from primary research, secondary research and stakeholder research was highly iterative. This ensured that the journey captured the envisaged treatment pathway in an accurate and comprehensible manner and that we were able to identify insight gaps as they emerged. From there we could then initiate the required steps to address them through additional research.

When executed correctly, a good customer journey is also adaptive and can be re-worked to reflect the changes that naturally occur over time. This is particularly important for journeys that have been created pre-launch and need to be revised, post-launch, to align with the emerging reality of the treatment, and for dealing with complicated treatments that are prone to nuanced changes. Both of these scenarios were true in the case of the CAR-T treatment.

The patient journey can also be used in collaborative design workshops with the client and their partners, as it successfully communicates a complicated pathway in a structured, easily digestible visual manner. It acts as a common language that different collaborators from different roles and backgrounds can use to achieve a shared understanding of the envisaged process and the end-to-end customer experience.

Last, and perhaps most importantly, the journey can be used as a stimulus to inform and generate new service ideas collaboratively, by focusing on key pain points and unmet needs.

This type of work is not possible without service design methodologies. These tools enable a diverse group of professionals from different roles and companies to come together and benefit from holistic, visual, customer-centred tools like empathy maps and customer journeys that make iteration and co-creation possible.

To find out how we can help you design a service for a complex medicine, contact simon.young@fishawack.com

If you would like to request a free, full copy of our CAR-T Service experience map (snippet pictured above) please get in touch with natasha.cowan@fishawack.com

See the article here:
In the precision medicine era, the line between products and services is blurred - PMLiVE

Posted in Genetic Engineering | Comments Off on In the precision medicine era, the line between products and services is blurred – PMLiVE

Beej Sheetal to start trials next year – BusinessLine

Posted: at 2:28 am

Jalna-based Beej Sheetal Research Pvt Ltd will commence the second stage biosafety research trials (BRL-II) for Bt Brinjal from April 2021, said the company chairman Suresh Agarwal.

Beej Sheetal has to carry out trials in at least two of the eight states for which it recently received approval from the Genetic Engineering Appraisal Committee (GEAC).

GEAC has approved BRL-II trials for Beej Sheetal's two transgenic Bt Brinjal hybrids, Janak and BSS-793.

Jalna district in central Maharashtra is known as the seed hub of the State because a large number of seed companies are located there. The dry climate of the district is most conducive for seed farming and storage.

Agarwal told BusinessLine that Beej Sheetal and its sister concern Kalash Seeds Pvt Ltd are privately held and have focussed primarily on vegetable seeds for the last 34 years. Beej Sheetal works on biotech research, while Kalash Seeds looks after production and marketing of the vegetable seeds.

Kalash Seeds had a turnover of ₹180 crore and a ₹30 crore profit for fiscal 2019-2020. Beej Sheetal, being a research-oriented company, had a turnover of ₹40 crore, he said.

Already, about ₹20 crore has been spent on research and other activities for Bt Brinjal, he said.

Agarwal said Beej Sheetal has been working on Bt Brinjal since 2005, but only now, after 15 years, has GEAC given permission for field trials. Once the trials are over, the field data will have to be submitted again to GEAC, which will then consider the technology for commercialisation.

He pointed out that for the last 10 years there has been a moratorium on GM brinjal. But recently the company wrote a simple letter to Prime Minister Narendra Modi, saying that it has worked on Bt technology for brinjal and that it can be a real Make in India product.

Visit link:
Beej Sheetal to start trials next year - BusinessLine

Posted in Genetic Engineering

CRISPR and the Fight Against COVID-19 BRINK News and Insights on Global Risk – BRINK

Posted: at 2:28 am

CRISPR represents a new frontier in gene editing. In comparison to other currently available technologies, it is a less expensive, more specific and simpler-to-use gene editing tool but is not without its criticism or concerns.


Gene editing is often discussed and presented in the context of revolutionizing treatment and diagnostics. In 2012, Jennifer Doudna and Emmanuelle Charpentier demonstrated the potential of CRISPR, which made the promise of gene editing therapies more tangible.

However, COVID-19 has brought additional applications to the fore, most notably exploring how CRISPR can be used as a mechanism to develop non-gene-based therapies. Once the process is refined, it could represent another notable step forward for the healthcare sector, although several regulatory hurdles remain before CRISPR-manufactured non-gene-editing therapies are widespread.

CRISPR works by using guide RNA to target a specific gene sequence. When the matching DNA sequence is found, the Cas9 protein, acting as a pair of scissors, cuts the DNA at the desired location. Once the cut has been made, the gene can be disabled, or missing genetic information can be inserted or replaced.
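As a toy illustration of this match-cut-insert logic (simplified strings standing in for real genomics; the sequences and function names are hypothetical):

```python
def crispr_cut(dna, guide):
    """Find the guide sequence in the DNA and 'cut' just after the match.

    Returns the two fragments, or None if the guide never binds
    (no match means Cas9 makes no cut).
    """
    site = dna.find(guide)
    if site == -1:
        return None
    cut = site + len(guide)
    return dna[:cut], dna[cut:]

def knock_in(dna, guide, insert):
    """Insert new genetic material at the cut site, as in gene replacement."""
    parts = crispr_cut(dna, guide)
    if parts is None:
        return dna  # no cut, genome unchanged
    left, right = parts
    return left + insert + right

# Toy example: the guide "CCGTA" matches inside this made-up sequence,
# so the "genome" is cut and new material is spliced in at that site.
genome = "ATGCCGTAGGCTA"
edited = knock_in(genome, "CCGTA", "XXX")
```

The real mechanism also has to deal with off-target matches, which is exactly the risk raised later in the piece: a guide that binds an unintended but similar sequence cuts in the wrong place.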

Due to its characteristics, CRISPR represents a new frontier in gene editing; in comparison to other currently available technologies, it is a less expensive, more specific and simpler-to-use gene editing tool.

The technology is not without its criticism or concerns. Most notably, there is the possibility of off-target mutations, whereby cuts are made in unintended DNA sequences. Additionally, ethical questions related to gene editing as a technology overall remain pervasive among policy makers and the public alike. Fortunately, increased study and greater exposure to the technology are lessening some of these concerns, and it continues to represent a potential and long-awaited therapeutic option for many genetic, and often rare, diseases, for many of which treatments have remained elusive.

COVID-19 and the Ebola epidemic in the mid-2010s illustrated the lengthy time lag of developing preventative therapies. CRISPR technology has the potential to overcome this challenge by significantly accelerating the development of vaccines or therapeutic options to respond to pandemics. This is because CRISPR technology is based on a naturally occurring gene editing system that is found in bacteria, which can be used to fight viruses.

Increased urbanization and contact between different world regions are likely to lead to an increase in the frequency of epidemics, and this new reality has spurred increased interest in CRISPR as a means to quickly respond to disease outbreaks, or even unpredictable seasonal infections. A quick response can help protect healthcare professionals in the short term and deliver on the promise that prevention is better than treatment.

There are a range of different applications for using CRISPR in the development of therapeutic solutions.

Traditional vaccines typically consist of a weakened or dead strain of the virus they are meant to inoculate patients against. When developing a vaccine, the manufacturer must select viral strains, which are then grown and incubated in hens' eggs or in cells.

CRISPR can be used to modify the incubator to increase viral products, which reduces the number of eggs or cells needed in the manufacturing process.

CRISPR technology can be used to engineer B cells, the white blood cells that produce the antibodies that in turn combat pathogens. B cells engineered with CRISPR can be injected into patients to provide them with the antibodies to an infection, such as COVID-19, without the patient ever being exposed to the disease itself.

This CRISPR technique effectively skips the step in traditional vaccines of introducing a version of the virus to effectively stimulate the development of an antibody. If the platform, rather than a specific vaccine, can receive authorization, it will allow companies to respond to viral outbreaks in a matter of months, as opposed to years, by simply engineering the appropriate B cells to respond to the viral threat.

The major drawback from this approach is that in the limited studies conducted thus far, it has only offered protection for a very short period of time.

Our increased understanding of genes has given rise to a new class of vaccine known as mRNA vaccines. The concept has become increasingly popular over the last few years, especially during COVID-19, and is the basis of CureVac's approach. The vaccines work by triggering cells to develop antibodies to a specific virus, e.g. SARS-CoV-2, without needing to introduce the virus itself.

Finally, beyond vaccination, there are plenty of other potential uses of CRISPR to combat COVID-19 and viral pandemics in the future. A recent Stanford study used CRISPR to directly target the COVID-19 virus (SARS-CoV-2) and disable it at the RNA level.

Beyond the current pandemic, CRISPR has potential applications to combat other viruses that humanity has been struggling with, ranging from the seasonal flu to HIV. Currently, influenza vaccinations are developed based on hypothesizing about next year's viral strain. Due to the manufacturing process, companies cannot quickly respond to a change in the external environment. However, CRISPR introduces a more agile and cheaper manufacturing process, either by improving and reducing the incubation process or by aiming to leverage the human body to produce its own antibodies.

Policy has already begun to respond to the potential implications of gene editing as part of a healthcare system's pandemic response mechanism. In a bold step suggesting a softening of European Union regulations, the European Parliament and Council of the European Union adopted a regulation introducing a temporary derogation from the GMO regulation for clinical trials on COVID-19 vaccines that utilize gene editing technologies.

This decision could have wider implications for the future use of gene editing for therapeutic use in the EU, especially as the EU looks to review its pharmaceutical strategy for the coming years, where personalized and gene therapies will be a significant component.

COVID-19 has brought the potential of CRISPR-based therapies to combat viral infections further into the foreground and encouraged research in the field. COVID-19 has also played an important role in helping to demystify gene editing for a broader audience. These developments are sure to lead to increased interest and investment in gene editing as a therapeutic solution or manufacturing mechanism outside of the rare and genetic disease space.

The EU's decision to provide an exemption to the GMO regulation could be the first step in a renewed understanding of CRISPR, or it could lead to political backlash. The coming months and years are likely to shape the future of CRISPR and gene editing as a therapeutic solution.

Go here to read the rest:
CRISPR and the Fight Against COVID-19 BRINK News and Insights on Global Risk - BRINK

Posted in Genetic Engineering

New HIV Gene Therapy, CAR-T Treatments Could be on the Horizon for Patients – BioSpace

Posted: at 2:28 am

Could gene therapy provide a solution to HIV? A new research project aims to find out.

The National Institutes of Health (NIH) has backed researchers at the University of Southern California and the Fred Hutchinson Cancer Research Center with a five-year, $14.6 million grant to develop a gene therapy that could potentially control HIV without the need for daily medications. Most HIV patients take a well-regimented cocktail of medications each day to control the virus. This therapy could change that.

According to an announcement from the Keck School of Medicine at USC, the goal will be to develop a therapy that prepares patients for a stem cell transplantation using their own cells with little to no toxicity, engineers their own stem cells to fight HIV, and stimulates those cells to quickly produce new, engineered immune cells once they're reintroduced into the patient. Hematopoietic stem cell transplants, also known as bone marrow transplants, have been used to treat some blood cancers. The idea is to infuse an HIV patient with healthy donor blood stem cells that can grow into any type of blood or immune cell.

The gene therapy strategy has been inspired by three cases where leukemia patients who also had HIV received blood stem cell transplants from donors who also carried a mutation that confers immunity to HIV. The mutation was in the CCR5 gene, which encodes a receptor that HIV uses to infect immune cells and is present in about 1 percent of the population, USC said.

The program will engineer blood cells to remove CCR5 from a patient's own stem cells. That will be combined with other genetic changes so that the progeny of engineered stem cells will release antibodies and antibody-like molecules that block HIV.

In addition to the potential gene therapy treatment, researchers are also assessing whether CAR-T treatments will benefit HIV patients. Researchers from Harvard University developed a Dual CAR T-cell immunotherapy that can potentially help fight HIV infection. First reported by Drug Target Review, the HIV-specific CAR-T cell is being developed not only to target and eliminate HIV-infected cells, but also to reproduce in vivo to enable patients to fight off the infection. HIV's primary target is T cells, which are part of the body's natural immune response.

Todd Allen, a professor of medicine at Harvard Medical School, said the Dual CAR-T cell immunotherapy has so far provided a strong, long-lasting response against HIV infection while being resistant to the virus itself.

According to the report, the Dual CAR T cell was developed by engineering two CARs into a single T cell. Each CAR contained a CD4 protein that allowed it to target HIV-infected cells, and a costimulatory domain that signaled the CAR T cell to increase its immune functions. As DTR reported, the first CAR contained the 4-1BB co-stimulatory domain, which stimulates cell proliferation and persistence, while the second has the CD28 co-stimulatory domain, which increases its ability to kill infected cells.

To protect the CAR-T cells from HIV, the team added the protein C34-CXCR4, which prevents HIV from attaching to and infecting cells. When that was added, the researchers found in animal models that the treatment was long-lived, replicated in response to HIV infection, killed infected cells effectively and was partially resistant to HIV infection.

Still, other researchers are looking to those rare individuals who are infected with HIV but are somehow able, on their own, to suppress the virus without any treatment. Researchers have sought to replicate in other patients what this small percentage can do naturally, sparing them daily regimens of medications. Through sequencing the genetic material of those rare individuals, researchers made an interesting discovery.

The team discovered large numbers of intact viral sequences in the elite controllers' chromosomes. But in this group, the genetic material was restricted to inactive regions, where DNA is not transcribed into RNA to make proteins, MedNewsToday reported.

Now the race is on to determine how this can be replicated and used to treat the nearly 38 million people across the globe who have been diagnosed with HIV.

Read more here:
New HIV Gene Therapy, CAR-T Treatments Could be on the Horizon for Patients - BioSpace

Posted in Genetic Engineering

Biotechnology could change the cattle industry. Will it succeed? – Salon

Posted: at 2:28 am

When Ralph Fisher, a Texas cattle rancher, set eyes on one of the world's first cloned calves in August 1999, he didn't care what the scientists said: He knew it was his old Brahman bull, Chance, born again. About a year earlier, veterinarians at Texas A&M extracted DNA from one of Chance's moles and used the sample to create a genetic double. Chance didn't live to meet his second self, but when the calf was born, Fisher christened him Second Chance, convinced he was the same animal.

Scientists cautioned Fisher that clones are more like twins than carbon copies: The two may act or even look different from one another. But as far as Fisher was concerned, Second Chance was Chance. Not only did they look identical from a certain distance, they behaved the same way as well. They ate with the same odd mannerisms and lay in the same spot in the yard. But in 2003, Second Chance attacked Fisher and tried to gore him with his horns. About 18 months later, the bull tossed Fisher into the air like an inconvenience and rammed him into the fence. Despite 80 stitches and a torn scrotum, Fisher resisted the idea that Second Chance was unlike his tame namesake, telling the radio program "This American Life" that "I forgive him, you know?"

In the two decades since Second Chance marked a genetic engineering milestone, cattle have secured a place on the front lines of biotechnology research. Today, scientists around the world are using cutting-edge technologies, from subcutaneous biosensors to specialized food supplements, in an effort to improve safety and efficiency within the $385 billion global cattle meat industry. Beyond boosting profits, their efforts are driven by an imminent climate crisis, in which cattle play a significant role, and growing concern for livestock welfare among consumers.

Gene editing stands out as the most revolutionary of these technologies. Although gene-edited cattle have yet to be granted approval for human consumption, researchers say tools like Crispr-Cas9 could let them improve on conventional breeding practices and create cows that are healthier, meatier, and less detrimental to the environment. Cows are also being given genes from the human immune system to create antibodies in the fight against Covid-19. (The genes of non-bovine livestock such as pigs and goats, meanwhile, have been hacked to grow transplantable human organs and produce cancer drugs in their milk.)

But some experts worry biotech cattle may never make it out of the barn. For one thing, there's the optics issue: Gene editing tends to grab headlines for its role in controversial research and biotech blunders. Crispr-Cas9 is often celebrated for its potential to alter the blueprint of life, but that enormous promise can become a liability in the hands of rogue and unscrupulous researchers, tempting regulatory agencies to toughen restrictions on the technology's use. And it's unclear how eager the public will be to buy beef from gene-edited animals. So the question isn't just if the technology will work in developing supercharged cattle, but whether consumers and regulators will support it.

* * *

Cattle are catalysts for climate change. Livestock account for an estimated 14.5 percent of greenhouse gas emissions from human activities, of which cattle are responsible for about two thirds, according to the United Nations' Food and Agriculture Organization (FAO). One simple way to address the issue is to eat less meat. But meat consumption is expected to increase along with global population and average income. A 2012 report by the FAO projected that meat production will increase by 76 percent by 2050, as beef consumption increases by 1.2 percent annually. And the United States is projected to set a record for beef production in 2021, according to the Department of Agriculture.
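Taken together, the two FAO figures above imply cattle's share of total anthropogenic greenhouse gas emissions; a quick back-of-the-envelope check (the inputs are the article's own numbers, rounded):

```python
# Livestock's share of anthropogenic GHG emissions (FAO estimate),
# and cattle's share within livestock, as cited in the article.
livestock_share = 0.145
cattle_within_livestock = 2 / 3

cattle_share = livestock_share * cattle_within_livestock
print(f"cattle ~ {cattle_share:.1%} of anthropogenic emissions")  # ~9.7%
```

So on these figures, roughly one in every ten units of human-caused emissions traces back to cattle, which is why even modest efficiency gains per animal matter at scale.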

For Alison Van Eenennaam, an animal geneticist at the University of California, Davis, part of the answer is creating more efficient cattle that rely on fewer resources. According to Van Eenennaam, the number of dairy cows in the United States decreased from around 25 million in the 1940s to around 9 million in 2007, while milk production has increased by nearly 60 percent. Van Eenennaam credits this boost in productivity to conventional selective breeding.

"You don't need to be a rocket scientist or even a mathematician to figure out that the environmental footprint or the greenhouse gases associated with a glass of milk today is about one-third of that associated with a glass of milk in the 1940s," she says. "Anything you can do to accelerate the rate of conventional breeding is going to reduce the environmental footprint of a glass of milk or a pound of meat."
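Van Eenennaam's "about one-third" can be sanity-checked against the herd and output figures cited above. This is an order-of-magnitude sketch with rough inputs (it lands near 0.23, the same ballpark as the roughly one-third she quotes; per-cow emissions are assumed constant, which understates other efficiency gains):

```python
# US dairy herd sizes cited in the article.
cows_1940s = 25e6
cows_2007 = 9e6
production_ratio = 1.60  # 2007 milk output ~160% of 1940s output

# Cows (a rough proxy for emissions) behind one glass of milk,
# 2007 relative to the 1940s: fewer cows, divided over more milk.
footprint_ratio = (cows_2007 / production_ratio) / cows_1940s
print(f"2007 footprint per glass ~ {footprint_ratio:.2f}x the 1940s")
```

The exact ratio depends on how per-cow emissions changed over the period, but the direction of the arithmetic is the point: more output from fewer animals shrinks the footprint per glass.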

Modern gene-editing tools may fuel that acceleration. By making precise cuts to DNA, geneticists insert or remove naturally occurring genes associated with specific traits. Some experts insist that gene editing has the potential to spark a new food revolution.

Jon Oatley, a reproductive biologist at Washington State University, wants to use Crispr-Cas9 to fine tune the genetic code of rugged, disease-resistant, and heat-tolerant bulls that have been bred to thrive on the open range. By disabling a gene called NANOS2, he says he aims to "eliminate the capacity for a bull to make his own sperm," turning the recipient into a surrogate for sperm-producing stem cells from more productive prized stock. These surrogate sires, equipped with sperm from prize bulls, would then be released into range herds that are often genetically isolated and difficult to access, and the premium genes would then be transmitted to their offspring.

Furthermore, surrogate sires would enable ranchers to introduce desired traits without having to wrangle their herd into one place for artificial insemination, says Oatley. He envisions the gene-edited bulls serving herds in tropical regions like Brazil, the world's largest beef exporter and home to around 200 million of the approximately 1.5 billion head of cattle on Earth.

Brazil's herds are dominated by Nelore, a hardy breed that lacks the carcass and meat quality of breeds like Angus but can withstand high heat and humidity. Put an Angus bull on a tropical pasture and "he's probably going to last maybe a month before he succumbs to the environment," says Oatley, while a Nelore bull carrying Angus sperm would have no problem with the climate.

The goal, according to Oatley, is to introduce genes from beefier bulls into these less efficient herds, increasing their productivity and decreasing their overall impact on the environment. "We have shrinking resources," he says, and need new, innovative strategies for making those limited resources last.

Oatley has demonstrated his technique in mice but faces challenges with livestock. For starters, disabling NANOS2 does not definitively prevent the surrogate bull from producing some of its own sperm. And while Oatley has shown he can transplant sperm-producing cells into surrogate livestock, researchers have not yet published evidence showing that the surrogates produce enough quality sperm to support natural fertilization. "How many cells will you need to make this bull actually fertile?" asks Ina Dobrinski, a reproductive biologist at the University of Calgary who helped pioneer germ cell transplantation in large animals.

But Oatley's greatest challenge may be one shared with others in the bioengineered cattle industry: overcoming regulatory restrictions and societal suspicion. Surrogate sires would be classified as gene-edited animals by the Food and Drug Administration, meaning they'd face a rigorous approval process before their offspring could be sold for human consumption. But Oatley maintains that if his method is successful, the sperm itself would not be gene-edited, nor would the resulting offspring. The only gene-edited specimens would be the surrogate sires, which act like vessels in which the elite sperm travel.

Even so, says Dobrinski, "That's a very detailed difference and I'm not sure how that will work with regulatory and consumer acceptance."

In fact, American attitudes towards gene editing have been generally positive when the modification is in the interest of animal welfare. Many dairy farmers prefer hornless cows (horns can inflict damage when wielded by 1,500-pound animals), so they often burn them off in a painful process using corrosive chemicals and scalding irons. In a study published last year in the journal PLOS One, researchers found that "most Americans are willing to consume food products from cows genetically modified to be hornless."

Still, experts say several high-profile gene-editing failures in livestock and humans in recent years may lead consumers to consider new biotechnologies to be dangerous and unwieldy.

In 2014, a Minnesota startup called Recombinetics, a company with which Van Eenennaam's lab has collaborated, created a pair of cross-bred Holstein bulls using the gene-editing tool TALENs, a precursor to Crispr-Cas9, making cuts to the bovine DNA and altering the genes to prevent the bulls from growing horns. Holstein cattle, which almost always carry horned genes, are highly productive dairy cows, so using conventional breeding to introduce hornless genes from less productive breeds can compromise the Holstein's productivity. Gene editing offered a chance to introduce only the genes Recombinetics wanted. Their hope was to use this experiment to prove that milk from the bulls' female progeny was nutritionally equivalent to milk from non-edited stock. Such results could inform future efforts to make Holsteins hornless but no less productive.

The experiment seemed to work. In 2015, Buri and Spotigy were born. Over the next few years, the breakthrough received widespread media coverage, and when Buri's hornless descendant graced the cover of Wired magazine in April 2019, it did so as the ostensible face of the livestock industry's future.

But early last year, a bioinformatician at the FDA ran a test on Buri's genome and discovered an unexpected sliver of genetic code that didn't belong. Traces of bacterial DNA called a plasmid, which Recombinetics used to edit the bull's genome, had stayed behind in the editing process, carrying genes linked to antibiotic resistance in bacteria. After the agency published its findings, the media reaction was swift and fierce: "FDA finds a surprise in gene-edited cattle: antibiotic-resistant, non-bovine DNA," read one headline. "Part cow, part bacterium?" read another.

Recombinetics has since insisted that the leftover plasmid DNA was likely harmless and stressed that this sort of genetic slipup is not uncommon.

"Is there any risk with the plasmid? I would say there's none," says Tad Sonstegard, president and CEO of Acceligen, a Recombinetics subsidiary. "We eat plasmids all the time, and we're filled with microorganisms in our body that have plasmids." In hindsight, Sonstegard says his team's only mistake was not properly screening for the plasmid to begin with.

While the presence of antibiotic-resistant plasmid genes in beef probably does not pose a direct threat to consumers, according to Jennifer Kuzma, a professor of science and technology policy and co-director of the Genetic Engineering and Society Center at North Carolina State University, it does raise the possible risk of introducing antibiotic-resistant genes into the microflora of people's digestive systems. Although unlikely, organisms in the gut could integrate those genes into their own DNA and, as a result, proliferate antibiotic resistance, making it more difficult to fight off bacterial diseases.

"The lesson that I think is learned there is that science is never 100 percent certain, and that when you're doing a risk assessment, having some humility in your technology product is important, because you never know what you're going to discover further down the road," she says. In the case of Recombinetics, "I don't think there was any ill intent on the part of the researchers, but sometimes being very optimistic about your technology and enthusiastic about it causes you to have blinders on when it comes to risk assessment."

The FDA eventually clarified its results, insisting that the study was meant only to publicize the presence of the plasmid, not to suggest the bacterial DNA was necessarily dangerous. Nonetheless, the damage was done. As a result of the blunder, Recombinetics' plan to raise an experimental herd in Brazil was quashed.

Backlash to the FDA study exposed a fundamental disagreement between the agency and livestock biotechnologists. Scientists like Van Eenennaam, who in 2017 received a $500,000 grant from the Department of Agriculture to study Buri's progeny, disagree with the FDA's strict regulatory approach to gene-edited animals. Typical GMOs are transgenic, meaning they have genes from multiple different species, but modern gene-editing techniques allow scientists to stay roughly within the confines of conventional breeding, adding and removing traits that naturally occur within the species. That said, gene editing is not yet free from errors and sometimes intended changes result in unintended alterations, notes Heather Lombardi, division director of animal bioengineering and cellular therapies at the FDA's Center for Veterinary Medicine. For that reason, the FDA remains cautious.

"There's a lot out there that I think is still unknown in terms of unintended consequences associated with using genome-editing technology," says Lombardi. "We're just trying to get an understanding of what the potential impact is, if any, on safety."

Bhanu Telugu, an animal scientist at the University of Maryland and president and chief science officer at the agriculture technology startup RenOVAte Biosciences, worries that biotech companies will migrate their experiments to countries with looser regulatory environments. Perhaps more pressingly, he says strict regulation requiring long and expensive approval processes may incentivize these companies to work only on traits that are most profitable, rather than those that may have the greatest benefit for livestock and society, such as animal well-being and the environment.

"What company would be willing to spend $20 million on potentially alleviating heat stress at this point?" he asks.

* * *

On a windy winter afternoon, Raluca Mateescu leaned against a fence post at the University of Florida's Beef Teaching Unit while a Brahman heifer sniffed inquisitively at the air and reached out its tongue in search of unseen food. Since 2017, Mateescu, an animal geneticist at the university, has been part of a team studying heat and humidity tolerance in breeds like Brahman and Brangus (a mix between Brahman and Angus cattle). Her aim is to identify the genetic markers that contribute to a breed's climate resilience, markers that might lead to more precise breeding and gene-editing practices.

"In the South," Mateescu says, heat and humidity are a major problem. "That poses a stress to the animals because they're selected for intense production to produce milk or grow fast and produce a lot of muscle and fat."

Like Nelore cattle in South America, Brahman are well-suited for tropical and subtropical climates, but their high tolerance for heat and humidity comes at the cost of lower meat quality than other breeds. Mateescu and her team have examined skin biopsies and found that relatively large sweat glands allow Brahman to better regulate their internal body temperature. With funding from the USDA's National Institute of Food and Agriculture, the researchers now plan to identify specific genetic markers that correlate with tolerance to tropical conditions.
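At its core, identifying a genetic marker that correlates with heat tolerance means testing for a statistical association between genotype and a measured trait. The sketch below is purely illustrative: the genotype calls (0/1 for absence/presence of a variant) and sweat-gland measurements are invented, and real analyses involve thousands of animals and mixed-model genome-wide association methods rather than a bare correlation.

```python
# Toy marker-trait association: correlate a binary marker with a
# continuous phenotype. All numbers here are invented for illustration.

genotypes = [0, 0, 1, 1, 1, 0, 1, 0]  # hypothetical marker calls per animal
sweat_gland_area = [1.1, 0.9, 1.8, 2.0, 1.7, 1.0, 1.9, 1.2]  # hypothetical trait

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson_r(genotypes, sweat_gland_area)
print(f"marker-trait correlation: {r:.2f}")
```

A strong correlation in a large, well-controlled sample would flag the marker as a candidate for more precise breeding or gene-editing decisions; in practice, population structure and relatedness must be accounted for before drawing that conclusion.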

"If we're selecting for animals that produce more without having a way to cool off, we're going to run into trouble," she says.

There are other avenues in biotechnology beyond gene editing that may help reduce the cattle industry's footprint. Although still early in their development, lab-cultured meats may someday undermine today's beef producers by offering consumers an affordable alternative to the conventionally grown product, without the animal welfare and environmental concerns that arise from eating beef harvested from a carcass.

Other biotech techniques hope to improve the beef industry without displacing it. In Switzerland, scientists at a startup called Mootral are experimenting with a garlic-based food supplement designed to alter cows' digestive makeup so that they emit less methane. Studies have shown the product to reduce methane emissions by about 20 percent in meat cattle, according to The New York Times.

Mootral's owner, Thomas Hafner, believes demand will grow as governments require methane reductions from their livestock producers in order to adhere to the Paris climate agreement. "We are working from the assumption that down the line every cow will be regulated to be on a methane reducer," he told The New York Times.

Meanwhile, a farm science research institute in New Zealand, AgResearch, hopes to target methane production at its source by eliminating methanogens, the microbes thought to be responsible for producing the greenhouse gas in ruminants. The AgResearch team is attempting to develop a vaccine to alter the cattle gut's microbial composition, according to the BBC.

Genomic testing may also allow cattle producers to see what genes calves carry before they're born, according to Mateescu, enabling producers to make smarter breeding decisions and select for the most desirable traits, whether it be heat tolerance, disease resistance, or carcass weight.

Despite all these efforts, questions remain as to whether biotech can ever dramatically reduce the industry's emissions or afford humane treatment to captive animals in resource-intensive operations. To many of the industry's critics, including environmental and animal rights activists, the very nature of the practice of rearing livestock for human consumption erodes the noble goal of sustainable food production. Rather than revamp the industry, these critics suggest alternatives such as meat-free diets to fulfill our need for protein. Indeed, data suggests many young consumers are already incorporating plant-based meats into their meals.

Ultimately, though, climate change may be the most pressing issue facing the cattle industry, according to Telugu of the University of Maryland, which received a grant from the Bill and Melinda Gates Foundation to improve productivity and adaptability in African cattle. "We cannot breed our way out of this," he says.

* * *

Dyllan Furness is a Florida-based science and technology journalist. His work has appeared in Quartz, OneZero, and PBS, among other outlets.

This article was originally published on Undark. Read the original article.

Read the original here:
Biotechnology could change the cattle industry. Will it succeed? - Salon


Husker-led project to advance, standardize field of phenotyping – Nebraska Today

Posted: at 2:28 am

The physical characteristics of a plant can reveal a lot about its underlying genetics. How many kernels of wheat does a single plant produce? How quickly does a corn plant grow? How much water does it use? Understanding a plant's physical traits, and linking those traits to specific genes, ultimately drives development of improved crops, higher yields for farmers and greater food security worldwide.

The science of capturing the characteristics of plants is called phenotyping, and until recently, it has been extremely time- and labor-intensive. Historically, trained researchers relied largely on the sight, touch and feel of a plant to record and understand the phenotype. Nowadays, technologies such as drones, robots, cameras and laser scanners can measure far greater numbers of plants instantly.

Plant phenotypes vary from region to region, in large part because plants are good at adapting to environmental variables, such as rainfall and soil composition. However, this variation can also be attributed, in part, to inconsistent protocols, technologies or algorithms used by researchers, said Yufeng Ge, an associate professor in the University of Nebraska-Lincoln's Department of Biological Systems Engineering. Plant phenotyping efforts, although rapidly burgeoning, are still localized and isolated. This creates a series of problems, including difficulty in comparing and interpreting results, underused research data and unknowingly duplicated activities, Ge said.

With the help of a new $3 million grant from the U.S. Department of Agriculture's National Institute of Food and Agriculture, Ge is leading a team of researchers from three universities who are working to change that.

The grant will bring together researchers from Nebraska, Texas A&M University and Mississippi State University to continue to expand phenotyping research already in progress; to expand the use of drone-based and other high-tech phenotyping methods; to create nationwide standards for collecting, cataloguing and analyzing phenotyping data; to teach the next generation of plant scientists how to organize, understand and effectively use the massive amount of information that high-tech phenotyping practices produce; and to make connections between a plant's physical and genetic properties.

"Ultimately, we're trying to accelerate the discovery of genes and gene networks in plants that have significance in the production of food, feed, fiber and fuel," Ge said.

At Nebraska, Ge and other researchers have been using drones and other technology to develop and refine new ways of phenotyping since 2014. In a greenhouse on Nebraska Innovation Campus, plants move along a conveyor belt into different chambers, where they're photographed by cameras that can identify traits beyond those that can be seen by the naked eye, for example, nitrogen content and leaf temperature. At the Eastern Nebraska Research and Extension Center near Mead, a Spidercam on cables moves over a test field, capturing images of plants as they grow and, just as at the Innovation Campus greenhouse, measuring both seen and unseen traits.

The images, measurements and other data that researchers are able to capture using these research methods are invaluable, Ge said, but the technology is cost-prohibitive for widespread use. The grant will allow engineers involved in the project to work toward the development of more affordable versions.

The project also calls for the creation of standards for the types of sensors used and information collected, which will allow researchers from across the United States to easily analyze phenotyping data captured anywhere.

Ge and others will also collaborate with the Genome to Fields Initiative, which brings together researchers from an array of organizations, including Iowa State University, the University of Wisconsin and the Iowa Corn Growers Association, to better understand the function of corn genes across different environments.

"The project paves the way for a nationwide, drone-based phenotyping network where tools and protocols are standardized, experiments are coordinated and efforts are concerted, and data can be seamlessly shared," Ge said.

For the potential of a nationwide phenotyping network to be fully realized, Ge said there must also be a nationwide network of researchers equipped to work with the vast amounts of data such a network would produce. To meet that need, Ge and others involved in the project are creating a curriculum that teaches the next generation of plant scientists the tools they will need to understand and use the data successfully. Students will need to understand the sensors that collect the measurements, as well as the measurements and images themselves. They also will need to know how to use phenotyping data with other data sets, such as sequencing data, weather and precipitation data and soil data, that together can provide a clear picture of how genes and outside factors work together to determine a plant's characteristics.
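Combining phenotyping records with an environmental data set is, at its simplest, a join on shared keys such as plot and date. The sketch below illustrates the idea only: the plot IDs, field names and values are invented, and a real pipeline would follow the standardized schemas the project aims to establish.

```python
# Toy join of phenotype records with weather observations by (plot, date).
# All identifiers and values are invented for illustration.

phenotypes = [
    {"plot": "A1", "date": "2020-07-01", "height_cm": 142.0, "leaf_temp_c": 24.1},
    {"plot": "A2", "date": "2020-07-01", "height_cm": 128.5, "leaf_temp_c": 25.3},
]

weather = {
    ("A1", "2020-07-01"): {"rain_mm": 4.2, "max_temp_c": 31.0},
    ("A2", "2020-07-01"): {"rain_mm": 4.2, "max_temp_c": 31.0},
}

def merge_records(phenotypes, weather):
    """Attach weather observations to each phenotype record by plot and date."""
    merged = []
    for rec in phenotypes:
        key = (rec["plot"], rec["date"])
        combined = dict(rec)
        combined.update(weather.get(key, {}))
        merged.append(combined)
    return merged

for row in merge_records(phenotypes, weather):
    print(row["plot"], row["height_cm"], row.get("rain_mm"))
```

Shared keys and consistent units are what make a join like this trivial; without agreed standards, each site's data would need bespoke cleaning before any cross-site analysis.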

Archie Clutter, director of the Agricultural Research Division at Nebraska, said the award takes the university's already renowned phenotyping program to the next level.

"This project has the potential to change the entire discipline of phenotyping, starting with the way data are collected all the way through the way that data are analyzed and shared," Clutter said. "It's an impressive, ambitious and very important project that could have a huge impact on how new crop lines are developed."

Husker researchers involved in the project include Ge, James Schnable, Stephen Baenziger, Yeyin Shi, Leah Sandall, Vikas Belamkar and Frank Bai. The team at Texas A&M University is led by Seth Murray and Amir Ibrahim, and the team at Mississippi State University is led by Alex Thomasson.
