Daily Archives: November 20, 2019

Human genome | Britannica

Posted: November 20, 2019 at 5:49 am

Human genome, all of the approximately three billion base pairs of deoxyribonucleic acid (DNA) that make up the entire set of chromosomes of the human organism. The human genome includes the coding regions of DNA, which encode all the genes (between 20,000 and 25,000) of the human organism, as well as the noncoding regions of DNA, which do not encode any genes. By 2003 the DNA sequence of the entire human genome was known.

The human genome, like the genomes of all other living animals, is a collection of long polymers of DNA. These polymers are maintained in duplicate copy in the form of chromosomes in every human cell and encode in their sequence of constituent bases (guanine [G], adenine [A], thymine [T], and cytosine [C]) the details of the molecular and physical characteristics that form the corresponding organism. The sequence of these polymers, their organization and structure, and the chemical modifications they contain not only provide the machinery needed to express the information held within the genome but also provide the genome with the capability to replicate, repair, package, and otherwise maintain itself. In addition, the genome is essential for the survival of the human organism; without it no cell or tissue could live beyond a short period of time. For example, red blood cells (erythrocytes), which live for only about 120 days, and skin cells, which on average live for only about 17 days, must be renewed to maintain the viability of the human body, and it is within the genome that the fundamental information for the renewal of these cells, and many other types of cells, is found.

The human genome is not uniform. Excepting identical (monozygous) twins, no two humans on Earth share exactly the same genomic sequence. Further, the human genome is not static. Subtle and sometimes not so subtle changes arise with startling frequency. Some of these changes are neutral or even advantageous; these are passed from parent to child and eventually become commonplace in the population. Other changes may be detrimental, resulting in reduced survival or decreased fertility of those individuals who harbour them; these changes tend to be rare in the population. The genome of modern humans, therefore, is a record of the trials and successes of the generations that have come before. Reflected in the variation of the modern genome is the range of diversity that underlies what are typical traits of the human species. There is also evidence in the human genome of the continuing burden of detrimental variations that sometimes lead to disease.

Knowledge of the human genome provides an understanding of the origin of the human species, the relationships between subpopulations of humans, and the health tendencies or disease risks of individual humans. Indeed, in the past 20 years knowledge of the sequence and structure of the human genome has revolutionized many fields of study, including medicine, anthropology, and forensics. With technological advances that enable inexpensive and expanded access to genomic information, the amount of information that can be extracted from the human genome, and the range of its potential applications, are extraordinary.

Since the 1980s there has been an explosion in genetic and genomic research. The combination of the discovery of the polymerase chain reaction, improvements in DNA sequencing technologies, advances in bioinformatics (mathematical biological analysis), and increased availability of faster, cheaper computing power has given scientists the ability to discern and interpret vast amounts of genetic information from tiny samples of biological material. Further, methodologies such as fluorescence in situ hybridization (FISH) and comparative genomic hybridization (CGH) have enabled the detection of the organization and copy number of specific sequences in a given genome.

Understanding the origin of the human genome is of particular interest to many researchers since the genome is indicative of the evolution of humans. The public availability of full or almost full genomic sequence databases for humans and a multitude of other species has allowed researchers to compare and contrast genomic information between individuals, populations, and species. From the similarities and differences observed, it is possible to track the origins of the human genome and to see evidence of how the human species has expanded and migrated to occupy the planet.

Continue reading here:
Human genome | Britannica

Posted in Genome | Comments Off on Human genome | Britannica

BioEssays Editor: ‘Junk’ DNA Full of Information! Including Genome-Sized Genomic Code – Discovery Institute

Posted: at 5:49 am

How many times have we heard it claimed that the vast majority of the human genome is junk and therefore could not have been designed? Even in the face of overwhelming evidence from the ENCODE project and numerous other studies showing that most of our genome has biochemical function, most evolutionists still maintain that our genomes are largely junk. But a few brave scientists, including some rare evolutionists, have been willing to buck that trend.

In a new article at Advanced Science News, "That 'Junk' DNA Is Full of Information!", Andrew Moore, the Editor-in-Chief of the respected biology journal BioEssays, comments on a new BioEssays paper. The paper finds that our DNA contains overlapping, layered, "dual-function pieces of information," including a "genomic code" that spans virtually the entire genome in order to "defin[e] the shape and compaction of DNA into the highly-condensed form known as chromatin." More about that paper in just a moment. It was written by leading Italian biologist Giorgio Bernardi, who played a major role in the discovery of isochores. Isochores are important in this story. But for now, let's look at Moore's essay. It has something worth mentioning in almost every paragraph.

Moore starts by saying that it should not be surprising that there is more function in the genome than we initially expected:

It should not surprise us that even in parts of the genome where we don't obviously see a functional code (i.e., one that's been evolutionarily fixed as a result of some selective advantage), there is a type of code, but not like anything we've previously considered as such.

From an intelligent design (ID) perspective, Moore is absolutely correct: finding more function in the genome should not surprise us. But Moore is not an ID proponent; he's clearly writing from an evolutionary perspective. Even as he describes extensive function in our genome, he frequently adds evolutionary narrative gloss just to remind you what side he's on. But within the evolutionary perspective, his support for mass genomic functionality does not represent the majority. There is a long history of evolutionary biologists predicting that non-protein-coding DNA is largely junk. (See "Post-ENCODE Posturing: Rewriting History Won't Erase Bad Evolutionary Predictions.") As one example, in 1980 Francis Crick and Leslie Orgel wrote that "Much DNA in higher organisms is little better than junk, and it would be folly in such cases to hunt obsessively for its function." Numerous similar claims have been made over the years.

Though clearly evolution-based, Moore's perspective stands out in an important way: it is open to seeing coordinated function across the entire genome. Moore thus proposes an idea with which ID proponents would heartily agree:

And what if it [this other code] were doing something in three dimensions as well as the two dimensions of the ATGC code? A paper just published in BioEssays explores this tantalizing possibility

So there are multiple layers of information in DNA controlling cellular processes that operate in multiple dimensions. Not only that, but as Moore explains, these codes are frequently overlapping within our DNA sequence:

One of the intriguing things about DNA sequences is that a single sequence can encode more than one piece of information, depending on what is reading it and in which direction: viral genomes are classic examples in which genes read in one direction to produce a given protein overlap with one or more genes read in the opposite direction (i.e., from the complementary strand of DNA) to produce different proteins. It's a bit like making simple messages with reverse-pair words (a so-called emordnilap). For example: REEDSTOPSFLOW, which, by an imaginary reading device, could be divided into REED STOPS FLOW. Read backwards, it would give WOLF SPOTS DEER.
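
A minimal Python sketch (an illustration added here, not from Moore's article; the word divisions are hard-coded) makes the "emordnilap" analogy concrete, together with its genomic parallel, reading a sequence from the complementary strand:

    # Moore's analogy: one string, two messages depending on reading direction.
    message = "REEDSTOPSFLOW"
    forward = [message[0:4], message[4:9], message[9:13]]    # REED STOPS FLOW
    rev = message[::-1]                                      # "WOLFSPOTSDEER"
    backward = [rev[0:4], rev[4:9], rev[9:13]]               # WOLF SPOTS DEER
    print(" ".join(forward), "|", " ".join(backward))

    # The genomic parallel: the same stretch of DNA as read from the
    # opposite (complementary) strand, in the opposite direction.
    COMPLEMENT = str.maketrans("ACGT", "TGCA")
    def reverse_complement(dna: str) -> str:
        return dna.translate(COMPLEMENT)[::-1]
    print(reverse_complement("ATGGCACGT"))  # ACGTGCCAT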

Though highly specified and difficult to produce by chance, overlapping codes are demonstrably present in our DNA. Proponents of intelligent design have long identified overlapping genes as a signature of design. For example, one chapter in the volume Biological Information: New Perspectives argues that "Multiple Overlapping Genetic Codes Profoundly Reduce the Probability of Beneficial Mutation." The chapter observes that "DNA sequences are typically poly-functional," with overlapping protein-coding sequences which can contribute to multiple overlapping codes simultaneously. But the likelihood of producing such information-rich, tightly constrained sequences by chance is exceedingly low: "it is difficult to understand how poly-functional DNA could arise through random isolated mutations."

How do overlapping codes relate to the current situation? Moore explains that these "dual-function pieces of information" are found throughout our genome, where DNA can both encode proteins and simultaneously define a "genomic code":

For two distinct pieces of information to be encoded in the same piece of genetic sequence we would, similarly, expect the constraints to be manifest in biases of word and letter usage, the analogies, respectively, for amino acid sequences constituting proteins, and their three-letter code. Hence a sequence of DNA can code for a protein and, in addition, for something else. This something else, according to Giorgio Bernardi, is information that directs the packaging of the enormous length of DNA in a cell into the relatively tiny nucleus. Primarily it is the code that guides the binding of the DNA-packaging proteins known as histones. Bernardi refers to this as the genomic code: a structural code that defines the shape and compaction of DNA into the highly-condensed form known as chromatin.

This genomic code is thus a genome-wide feature, woven throughout our DNA, including portions of the genome that evolutionists have typically assumed had no function. This code is defined by the GC content of a stretch of DNA: the proportion of base pairs that are guanine-cytosine (hence "GC") rather than adenine-thymine. In protein-coding DNA, the third base in codons can often vary from AT/TA to CG/GC without affecting the amino acid being specified. Evolutionists have presumed that the precise nucleotide in this third position was irrelevant, so long as the codon was synonymous, and that variation in the third nucleotide represented an unimportant non-functional feature. But Moore explains that the third nucleotide in a codon can have great functional importance apart from merely specifying the amino acid, and could actually help define this genomic code, which overlaps with the protein code:

Protein-coding sequences are also packed and condensed in the nucleus, particularly when they're not in use (i.e., being transcribed, and then translated into protein), but they also contain relatively constant information on precise amino acid identities, otherwise they would fail to encode proteins correctly: evolution would act on such mutations in a highly negative manner, making them extremely unlikely to persist and be visible to us. But the amino acid code in DNA has a little catch that evolved in the most simple of unicellular organisms (bacteria and archaea) billions of years ago: the code is partly redundant. For example, the amino acid Threonine can be coded in eukaryotic DNA in no fewer than four ways: ACT, ACC, ACA or ACG. The third letter is variable and hence available for the coding of extra information. This is exactly what happens to produce the genomic code, in this case creating a bias for the ACC and ACG forms in warm-blooded organisms. Hence, the high constraint on this additional code, which is also seen in parts of the genome that are not under such constraint as protein-coding sequences, is imposed by the packaging of protein-coding sequences that embody two sets of information simultaneously.
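
The redundancy Moore describes is easy to see in code. Here is a small Python sketch (added for illustration; the sequences are made up) showing that synonymous threonine codons leave the protein unchanged while shifting GC content at third codon positions, a quantity often written GC3:

    # All four codons below encode threonine; they differ only at position 3,
    # so third-position GC content (GC3) can vary without changing the protein.
    THREONINE_CODONS = ["ACT", "ACC", "ACA", "ACG"]

    def gc3(coding_sequence: str) -> float:
        """Fraction of third codon positions that are G or C."""
        thirds = coding_sequence[2::3]  # every third base, starting at index 2
        return sum(base in "GC" for base in thirds) / len(thirds)

    # Two sequences encoding the same Thr-Thr-Thr peptide, opposite GC3 bias:
    print(gc3("ACTACAACT"))  # 0.0 (AT-ending codons)
    print(gc3("ACCACGACC"))  # 1.0 (GC-ending codons, the bias Moore
                             #      describes in warm-blooded organisms)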

Moore's evolutionary bias is evident here as he repeatedly adds narrative gloss, ascribing functional aspects of our genome to evolution, rather than simply describing the functional nature of DNA and leaving evolution out of it. But the substance of what he's saying identifies function in an aspect of the genome that evolutionists have frequently ignored as junk.

He goes on to explain that this genomic code is not limited to protein-coding sequences, where it overlaps with the code that specifies protein sequences. The code also persists throughout giant portions of our genome characterized by repetitive sequences that evolutionary scientists have, again, frequently ignored as junk. Read the following carefully, and try to filter out the gloss. It basically admits that these massive segments of our genome are functional:

But didn't we start with an explanation for non-coding DNA, not protein-coding sequences? Yes, and in the long stretches of non-coding DNA we see information in excess of mere repeats, tandem repeats and remnants of ancient retroviruses: there is a type of code at the level of preference for the GC pair of chemical DNA bases compared with AT. As Bernardi reviews, synthesizing his and others' groundbreaking work, in the core sequences of the eukaryotic genome, the GC content in structural organizational units of the genome termed isochores increased during the evolutionary transition between so-called cold-blooded and warm-blooded organisms. And, fascinatingly, this sequence bias overlaps with sequences that are much more constrained in function: these are the very protein-coding sequences mentioned earlier, and they, more than the intervening non-coding sequences, are the clue to the genomic code. In eukaryotic genomes, the GC sequence bias proposed to be responsible for structural condensation extends into non-coding sequences, some of which have identified activities, though less constrained in sequence than protein-coding DNA. There it directs their condensation via histone-containing nucleosomes to form chromatin.

What we see here is that major portions of our genome, traditionally viewed as junk, are actually full of information "in excess of mere repeats, tandem repeats and remnants of ancient retroviruses" because "there is a type of code at the level of preference for the GC pair of chemical DNA bases compared with AT." The purpose of the code, in short, is to direct DNA-packing in the nucleus.

The genomic code is largely defined by huge GC-biased portions of the genome called isochores. When you hear the word isochore, think of humongous portions of our genome characterized by repetitive sequences of DNA that most evolutionists have typically ignored as junk, but that ID proponents have predicted as probably having function.
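
Since isochores are defined by the GC level of long stretches of DNA, the idea can be sketched in a few lines of Python. This is an illustration added here: the class boundaries are approximate figures from the isochore literature, and real isochores are measured over windows hundreds of kilobases long, not the toy sequence below:

    # Sliding-window GC content, the quantity that defines isochores.
    def gc_content(window: str) -> float:
        return sum(base in "GC" for base in window) / len(window)

    def isochore_class(gc: float) -> str:
        # Approximate boundaries for Bernardi's L1/L2/H1/H2/H3 families.
        if gc < 0.37: return "L1 (GC-poor)"
        if gc < 0.41: return "L2"
        if gc < 0.46: return "H1"
        if gc < 0.53: return "H2"
        return "H3 (GC-rich)"

    dna = "ATATATGCGCGCATGC" * 100  # made-up sequence for demonstration
    window = 400
    for start in range(0, len(dna) - window + 1, window):
        gc = gc_content(dna[start:start + window])
        print(start, f"{gc:.2f}", isochore_class(gc))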

Giorgio Bernardi's paper in BioEssays provides an extensive discussion of the literature. It shows that isochores have functional importance and that the GC level of isochores defines a vital genomic code. Bernardi explains:

[T]he genomic code, which is responsible for the pervasive encoding and molding of primary chromatin domains (LADs and primary TADs, namely the gene spaces/spatial compartments), resolves the longstanding problems of non-coding DNA, junk DNA, and selfish DNA, leading to a new vision of the genome as shaped by DNA sequences.

Bernardi's view is that most of the genome is functional, contradicting the typical junk DNA perspective:

By the end of the 1980s, our knowledge of the isochore organization of the human genome had not only rejected what had been called the bean-bag view of the genome, that is, a collection of genes randomly scattered over vast expanses of junk DNA; but it had also indicated that the genome is an integrated structural, functional, and evolutionary system. This view arose from a comparative study of vertebrate genomes, centered on the analysis of their compositional patterns, namely of the compositional distributions of large DNA segments, coding sequences, and introns.

Thus, the presence of GC-rich isochores leads us to reject the junk DNA view. It indicates that the genome is "an integrated structural, functional, and evolutionary system." Ignoring Bernardi's evolutionary gloss, which wrongly assumes that integrated structural and functional systems can arise by blind evolutionary mechanisms, his statement is exactly what ID theory would expect. Bernardi continues explaining how we know that isochores are functional and carry the genomic code, which overlaps with the genetic code:

The functional importance of isochores was already evident in the 1980s because of the correlations of their GC levels with all the genome properties tested. It was later confirmed by investigations carried out in the 1990s. The first indications that the base composition of isochores was under constraint came from the strong correlations between the composition of interspersed repeats, such as the GC-poor LINES and GC-rich SINES, and the composition of the GC-poor and GC-rich isochores, respectively, in which those sequences were located. The next step was the extension of the compositional correlations to genes (exons, introns, codon positions) located in GC-poor and GC-rich isochores, correlations that affect codon usage and amino acid composition of the encoded proteins. These points were subsequently reinforced, leading to the proposal that a genomic code was responsible for the compositional correlations just mentioned. As shown in Table S3, Supporting Information, the genomic code was further extended in the following years to include the sequence distributions, the functional properties associated with GC-poor and GC-rich isochores, and the structure and nuclear location of interphase chromatin.

Only recent investigations showed, however, that the genomic code: 1) is a structural code in that it directly encodes and molds chromatin structures and defines nucleosome binding; 2) is pervasive because it applies to the totality of the genome; 3) overlaps the genetic code and constrains it, by affecting the composition (but not the function) of coding sequences (and contiguous non-coding sequences), codon usage, and amino acid composition of the encoded proteins, as already mentioned.

Moore's article, describing Bernardi's findings, concludes strikingly:

These regions of DNA may then be regarded as structurally important elements in forming the correct shape and separation of condensed coding sequences in the genome, regardless of any other possible function that those non-coding sequences have: in essence, this would be an explanation for the persistence in genomes of sequences to which no function (in terms of evolutionarily-selected activity) can be ascribed (or, at least, no substantial function).

We may marvel at such complicated structures and ask, "but do they need to be quite so complicated for their function?" Well, maybe they do, in order to condense and position parts of the protein in the exact orientation and place that generates the three-dimensional structure that has been successfully selected by evolution. But with a knowledge that the genomic code overlaps protein coding sequences, we might even start to become suspicious that there is another selective pressure at work as well

Moore doesn't specify what the other selective pressure is, but clearly he sees the functionally important genomic code as pervasive throughout the genome. So here's what we have: evolutionary scientists proposing that most of our genome's sequence has functional importance because it carries a genomic code, controlling the three-dimensional packing in the nucleus. This code even overlaps with the genetic code in protein-coding DNA. Such a perspective directly contradicts the evolutionary paradigm of a genome flooded with junk.

Why would evolutionary scientists like Moore and Bernardi step outside that paradigm? The answer is simple: their views are driven by the data. More (or rather, Moore) power to them!

Photo by Ann Kathrin Bopp via Unsplash.

Read the original:
BioEssays Editor: 'Junk' DNA Full of Information! Including Genome-Sized Genomic Code - Discovery Institute

Posted in Genome | Comments Off on BioEssays Editor: ‘Junk’ DNA Full of Information! Including Genome-Sized Genomic Code – Discovery Institute

Biomedical Engineering Researcher to Explore Immune System Impact on Genome Engineering – University of Arkansas Newswire

Posted: at 5:49 am

A biomedical engineering faculty member has received a $50,000 award to pursue a new line of research into the immune system's role in genome engineering.

Christopher Nelson, assistant professor of biomedical engineering, earned the funding through the American Society of Gene and Cell Therapy's Career Development program. Nelson holds the 21st Century Professorship in Biomedical Engineering.

His research program is adapting genome editing technologies to treat genetic disease. Genetic diseases are caused by changes that alter the normal DNA sequence. Nelson's research involves modifying DNA by using molecular scissors, known as CRISPR, to make precise genome modifications with the goal of fixing disease-causing mutations.

The ASGCT award adds a new direction to Nelson's research that will explore how the immune system impacts emerging genome engineering therapies.

"This line of research is important to the field, as high-profile failures in the past warn that the immune system plays a critical role in the success or failure of biomolecular therapies," Nelson said.

Nelson's previous research in animal models of muscular dystrophy has shown the body's immune response to CRISPR may provide an additional barrier to translating genome editing therapies to clinical use, he said. That work was published in Nature Medicine earlier this year.

"The immune response to delivery vehicles or genome editing technologies could prevent successful gene editing or cause a dangerous host response," Nelson said. "Full characterization of these risks and strategies to avoid a host response are needed for clinical development."

The project will launch new lines of work in Nelson's lab by providing preliminary data for future research related to immune cell biology and immune tolerance.

Raj Rao, head of the department of biomedical engineering, said the award is an important step for Nelson's research.

"I am extremely proud of Chris for receiving the ASGCT Career Development Award and for pursuing this potentially transformative project that seeks to better understand the impact of cutting-edge gene editing technologies on the immune system," he said.

Read the rest here:
Biomedical Engineering Researcher to Explore Immune System Impact on Genome Engineering - University of Arkansas Newswire

Posted in Genome | Comments Off on Biomedical Engineering Researcher to Explore Immune System Impact on Genome Engineering – University of Arkansas Newswire

Is genome sequencing the answer to rare diseases? – Hyderus Cyf

Posted: at 5:49 am

Rare diseases, despite the infrequency their name implies, cumulatively affect around 350 million people worldwide. This figure includes roughly seventy million Indians. Could genome sequencing prove to be the tool that begins to spark medical innovation and diagnosis for the millions affected by such disorders?

The Economic Times notes a recent case in which genome sequencing has proven to be an effective means of diagnosis for conditions classified as rare diseases. A woman, who was at the time three months pregnant, came to see a paediatrician at Government Medical College in Kozhikode wanting to know if there was a way to find out if her unborn child could develop a rare immune disease her firstborn was suffering from.

The article notes that the white blood cells in her six-year-old daughter's body behaved abnormally, leading to organ infections and impaired growth. Doctors, so far, had been unable to cure the girl, she said. While the name of the condition is never revealed in the article, many rare diseases also fall under the categorisation of autoimmune disorders, such as Asherson's syndrome.

The woman underwent an antenatal procedure to retrieve genetic material from the unborn child. This was analysed and compared to genetic testing performed on the six-year-old daughter. The tests brought the welcome news that the woman's baby would not have the same genetic disorder as her previous child.

This test had the luxury of comparing directly to a relative. However, most testing has to be compared to databases, with one major drawback. "No reliable genetic information of Indians is available and for research, our scientists have to rely on gene data banks from the US and the UK or of Caucasians," said Council of Scientific and Industrial Research (CSIR) director-general Shekhar C. Mande.

As of now, Indian genomes represent only 0.2 percent of the global genetic databanks. The current majority of genomes, around 96 percent, are of European ancestry. Ignoring the study of Indian genes leaves what could be a medical goldmine all but untapped.

Previous genomic studies conducted in India have uncovered genes exclusive to the Indian population that present a unique risk of developing diabetes. Knowledge of risks such as these can allow the Indian medical system, as well as government policy, to be better informed when making decisions regarding disease prevention and treatment.


Here is the original post:
Is genome sequencing the answer to rare diseases? - Hyderus Cyf

Posted in Genome | Comments Off on Is genome sequencing the answer to rare diseases? – Hyderus Cyf

Co-creator of CRISPR lectures about future applications of genome editing technology – Daily Bruin

Posted: at 5:49 am

A University of California professor and co-originator of the genome editing technology Clustered Regularly Interspaced Short Palindromic Repeats said at a campus lecture series Thursday that researchers plan to expand the technology in order to increase its human applications.

Jennifer Doudna, a UC Berkeley biochemistry professor, engaged students and the greater UCLA science community during the quarterly Donald J. Cram Distinguished Lecture series.

The Cram lecture series, a quarterly departmental event, invites prominent academics in the field of chemistry to speak about their research. The series is dedicated to Donald J. Cram, who was a Nobel laureate and a chemistry professor at UCLA for over 50 years.

This fall, the series was hosted by UCLA chemistry professor and Cram Chair Patrick Harran.

Scientists use CRISPR technology, formally known as CRISPR-Cas9, to modify DNA sequences and gene functions. Cas9 is a protein that can cut the strands of DNA, like molecular scissors.

CRISPR is studied and used by students, scientists and researchers to advance progress in the field of gene editing, in medicine and the life sciences.

The UC holds the largest CRISPR patent portfolio in the nation with 16 total patents, according to a UC Berkeley press release.

The United States Patent and Trademark Office granted the UC, along with the University of Vienna and Emmanuelle Charpentier, the director of the Max Planck Institute for Infection Biology, its 16th patent in October.

Doudna's involvement in CRISPR technology began around 2005, when a professor at UC Berkeley, Jill Banfield, invited Doudna to help her with research into the mechanism. From there, Doudna teamed up with Charpentier, who was working with a CRISPR system and its associated protein, Cas9, in 2011.

Doudna is one of the creators of the CRISPR utility for the permanent excision of harmful genes. Doudna said that she developed the idea for the CRISPR technology in 2011 in collaboration with Charpentier.

During the lecture, Doudna detailed how scientists regulate CRISPR enzymes to modify DNA.

"CRISPR is a portion of the bacterial genomic sequence that acts as an adaptive immune system," Doudna said.

Bacteria build up the CRISPR system through viral infections, which allows the genome to recognize foreign DNA insertions. These DNA sequences incorporate themselves into the bacterial genome at the CRISPR locus, a genetic database of past infections.

Doudna said this locus was of unique interest to her.

"Those sequences, called CRISPR, are transcribed in RNA molecules that provide the zip codes for Cas proteins, allowing them to recognize foreign DNA and cut it up," Doudna said.

Doudna and Charpentier, with the assistance of their team, realized that CRISPR RNA contains a 20-nucleotide sequence which interacts with DNA in a complementary fashion.

This complementarity allows the protein to form a double-stranded break in DNA, necessitating a second RNA, tracrRNA, to form this functional unit, Doudna said.

"And it was (biochemist) Martin Jinek in our lab who figured out that you could combine these two RNAs into a single guide RNA," Doudna said.

From this experiment, Jinek found that single guide RNAs were used by Cas9 to excise DNA at specific sites in a plasmid, a circular piece of bacterial DNA. The revelation from this was that, upon excision, DNA would repair itself in animals and plants, Doudna said.
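
The targeting logic Doudna describes, a 20-nucleotide guide matching DNA next to a short motif Cas9 requires (the "NGG" PAM), can be sketched in Python. This is a toy illustration added here, not lab software, and the sequences are invented:

    # Toy model of Cas9 target search: the guide must match the DNA exactly
    # and be followed by an NGG PAM; Cas9 cuts about 3 bp upstream of the PAM.
    GUIDE = "ACGTACGTACGTACGTACGT"            # hypothetical 20-nt guide
    PLASMID = "TTT" + GUIDE + "TGG" + "CCCT"  # invented target with a TGG PAM

    def find_cut_sites(dna: str, guide: str) -> list:
        sites = []
        for i in range(len(dna) - len(guide) - 2):
            match = dna[i:i + len(guide)] == guide
            pam = dna[i + len(guide):i + len(guide) + 3]
            if match and pam[1:] == "GG":          # the "N" can be any base
                sites.append(i + len(guide) - 3)   # approximate cut position
        return sites

    print(find_cut_sites(PLASMID, GUIDE))  # [20]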

Doudna said at the end of her talk that the system is becoming increasingly important in the field of medicine and is currently being used at UCLA by Donald Kohn, a professor of microbiology, immunology and molecular genetics.

"We're within about five years, maybe less, from being able to make, essentially, any change to any genome in any type of cell," Doudna said.

Doudna stressed that this ability to make changes in the genome comes with bioethical responsibility for genome editing in humans.

Fourth-year biochemistry student Jeremy Shek, who attended the event, said although he had done a project that was an offshoot of CRISPR, he had not heard of the progress Doudna discussed.

It is important to be informed on advancements and progress in the field, he added.

Fourth-year bioengineering student Timothy Yu said he came to the lecture to see Doudna in person and get a more solid grasp on the methodology of CRISPR.

Lexi Omholt, a fourth-year microbiology, immunology and molecular genetics student, said that she came to the talk to understand the basis of CRISPR technology.

"Jennifer Doudna was one of the reasons I chose my major," Omholt said. "At that time, CRISPR came into popular knowledge, and the knockout tool was just coming into use. I am involved in a cancer lab, the Soragni Lab, that uses CRISPR-Cas9 on a regular basis."

Follow this link:
Co-creator of CRISPR lectures about future applications of genome editing technology - Daily Bruin

Posted in Genome | Comments Off on Co-creator of CRISPR lectures about future applications of genome editing technology – Daily Bruin

Genome research gave life back to this West Vancouver cancer survivor – Vancouver Is Awesome

Posted: at 5:49 am

Candy Woodworth knows she's won the lottery.

In the past five years, she's seen a daughter get married and celebrate the births of two grandchildren.

But for a while, whether the West Vancouver grandmother would be around to mark those milestones was far from certain.

Six years ago, Woodworth was a busy 65-year-old, looking after her first grandchild while her daughters took care of the family business.

It was during a Pilates class, when she was lying on her stomach, that she first noticed something odd: an uncomfortable feeling in her lower abdomen. Woodworth didn't think much of it, but when she felt it again at the next week's class, she made an appointment to see her doctor, who sent her for an ultrasound.

When she got back home, the phone was ringing before she even had her coat off, telling her to come to her doctors office right away.

There she got the news that she had ovarian cancer.

According to the BC Cancer Agency, over 300 women in B.C. will be diagnosed with ovarian cancer this year. It's not nearly as common as breast cancer (women have about a one in 70 lifetime chance of getting ovarian cancer), but the prognosis can be far more serious.

"You just don't feel anything," said Woodworth. "That's the difficult thing with ovarian cancer."

Because there is no way to screen for ovarian cancer, and the disease is usually without symptoms until at an advanced stage, effective treatment is often a challenge.

"We have treatments that are very likely to cause the cancer to regress and improve but there's a very high risk of recurrence," said Dr. Anna Tinker, a medical oncologist at BC Cancer who is one of the leading experts in gynecological cancers and who worked on Woodworth's case.

Woodworth knew she was facing a serious diagnosis. So she did some research and was referred to the expert team that specializes in gynecological cancer at Vancouver General Hospital, headed by Dr. Dianne Miller.

Woodworth had surgery to remove the tumour from her abdomen, which was confirmed as a high-grade Stage 3 aggressive cancer.

But her journey was only just beginning.

For the next four and a half months, Woodworth had 18 rounds of chemotherapy. "After my third week I literally crawled on my hands and knees into the chemo clinic," she said. "I was literally throwing up as I was sitting in the chair."

She credits her support team of her husband and three daughters for getting her through it. And the chemotherapy worked at first.

But 18 months later, the cancer was back, with a tumour on her colon. She had another surgery.

Throughout the process, "my attitude was always 'Let's get in there. Let's get the job done,'" she said.

When the tumour returned again in the same place, six months later, Woodworth's doctors signed her up for an experimental research program, the Personalized Onco-Genomics program, run by a team of doctors and researchers at the BC Cancer Agency.

The program, which is usually only open to patients after standard treatments have been tried, takes a novel approach to cancer, looking for genetic mutations in a patient's tumour for clues to what's causing the cancer to grow, and with that, a possible treatment.

In Woodworth's case, the analysis showed her tumour had a signature similar to that seen when a BRCA gene mutation is present, more usually associated with some types of breast cancer, said Tinker.

In early 2017 Woodworth's results were matched with an experimental drug, olaparib (Lynparza).

In Woodworth's case, the drug worked. She's now been on it for two and a half years with no side effects and no recurrence of her cancer.

The 12 capsules she takes every day, down from the number she started on, have literally saved her life.

"I'm so grateful for every day," said Woodworth. "I don't think the public realizes the scientists we have here in Vancouver."

Woodworth is among the more dramatic success stories to come out of the personalized genomic research project, falling into a small group of "super responder" patients.

Others include a Langley woman whose metastatic breast cancer was beaten back by a drug commonly used to treat diabetes, in addition to hormone treatment.

Another Metro Vancouver woman was saved when scientists discovered her advanced colon cancer had a protein that responded to blood pressure medication.

Since the program started in 2012, 1,136 patients, including 123 children, have been enrolled.

Patients who take part need to understand the process is experimental, said Tinker. While helpful new information is gleaned in about 80 per cent of cases, the result is not always as dramatic as it was in Woodworths case and not all cancer patients are helped by the genome analysis.

In some patients, no helpful mutations are discovered that can be used as clues to treatment and in some cases, no drugs are a match.

Cancer patients start new treatments as a result of their genome results about 40 per cent of the time.

Ideally, patients who are matched with treatments can be enrolled in clinical trials that make expensive drugs available to them free of charge, said Tinker.

But that's not always the case.

Woodworth knows she's lucky. "I knew what I was up against," she said, but she remained stubbornly optimistic, describing herself as a "glass half full kind of person."

These days, Woodworth, who recently celebrated her 70th birthday, takes delight in spending time with her grandkids.

"I can't let a day go by without stopping by for a quick hug," she said. "I don't stay around and clean my house. I get out there."

"I don't take anything for granted. That's the one thing you take away when you feel that mortality. You have to live every day the best that you can."

She hopes stories like hers will lead to money for research that will benefit other cancer patients.

The research at the Personalized Onco-Genome program is funded by approximately $22.7 million from the BC Cancer Foundation, largely raised through philanthropic donations, as well as by research grants, particularly through the Canada Foundation for Innovation.

"Hopefully they'll find more [information on how cancers behave], and more people will survive," said Woodworth. "That's what I want for everyone. There is hope out there."

To find out how to donate to the research funded by the BC Cancer Foundation, including the Personalized Onco-Genomics program, click here.

To find out how to donate to the VGH/UBC Hospital Foundation, which benefits programs including the Ovcare research team examining gynecological cancers, click here

To view a CBC Nature of Things documentary on the Personalized Onco-Genomics program, which aired on the network in February 2017, click here

Originally posted here:
Genome research gave life back to this West Vancouver cancer survivor - Vancouver Is Awesome

Posted in Genome | Comments Off on Genome research gave life back to this West Vancouver cancer survivor – Vancouver Is Awesome

Can Wheat Save the World? – Seed World

Posted: at 5:49 am

The wheat genome is complex. Can we figure it out in time to feed a growing world population?

A half-century ago, human beings took their first steps on the moon.

At first, it's hard to imagine what that has to do with wheat, but for attendees of the first International Wheat Congress held in Saskatoon, Sask., earlier this year, the connection was apt.

"I was 12 years old when that happened. Being a young child and seeing this on a small black-and-white TV was impressive," said Martin Kropff, director-general of the International Maize and Wheat Improvement Center (CIMMYT), who spoke on the opening day of the congress. "Two men walked on the moon, but the fact is, the moon landing was the result of 500,000 people working together."

"If we can put a man on the moon, we can solve 800 million people going to bed hungry every day. Wheat is a crucial part of that challenge."

Hosted by the University of Saskatchewan, the event brought together 900 researchers, agronomists and other scientists from 50 countries to talk about all things T. aestivum and T. turgidum.

The challenges ahead were the main focus as attendees zeroed in on the fact the world population is growing and more food is needed, specifically cereals, which Kropff said will comprise a third of all calories and protein in the human diet in the future.

But for food prices to remain constant, annual yield gains in wheat would have to increase from 1.2 to 1.7%.

"That's no small challenge," Kropff added.

As noted by Tim Searchinger, senior fellow at the World Resources Institute (WRI), the reality is that agriculture occupies half the world's vegetated land. That means the sheer task of feeding the world is a huge challenge for biodiversity and ecosystems, especially since agriculture produces a quarter of the world's greenhouse gas emissions.

Searchinger presented the WRI's recent report "Creating a Sustainable Food Future," for which he was lead author. It is laid out as a five-course "menu" of solutions to ensure we can feed 10 billion people by 2050 without increasing emissions, fueling deforestation or exacerbating poverty.

Searchinger noted that between 2010 and 2050, food production must rise 56% in order to feed a growing population, while greenhouse gas emissions must fall by two-thirds in the process.
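
As a rough back-of-the-envelope check (arithmetic added here, not taken from the report), a 56% rise spread over the 40 years from 2010 to 2050 corresponds to compound annual growth of about 1.1%, which helps put the 1.2 to 1.7% yield-gain figures above in context:

    # What constant annual growth rate multiplies 2010 output by 1.56 by 2050?
    target_multiple = 1.56          # +56% total food production
    years = 2050 - 2010             # 40 years

    annual_rate = target_multiple ** (1 / years) - 1
    print(f"{annual_rate:.2%}")     # ~1.12% per year, compounded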

"Wheat can play a huge role in that," he said.

According to the report, to provide continuing yield gains, breeding will need to become more nuanced.

In the past, much yield gain in the major cereals like wheat resulted from shifting biomass from vegetative parts to seeds and shortening and stiffening of the stems so they could support more grain (resulting from higher fertilizer application) without falling over. These traits, which were largely responsible for the Green Revolution, are in some cases reaching their biological limits; crops can only grow so close to one another before they have no more space, and crops can only direct so much of their growth into edible portions before they will no longer stand upright, the report's authors state.

These limits, plus the need to boost crop yields even faster than in historical trends, present the crop breeding challenge.

As a result, four major related opportunities exist to increase crop yields through improved breeding: speeding up crop breeding cycles, marker-assisted and genomics-assisted breeding, improvement of orphan crops, and genetic modification. Searchinger emphasized that all these technologies play a role in creating new wheat for the world.

"That's why we're here. The work you're doing is incredibly important," he said, and added that four recommendations to enable innovation in wheat include boosting breeding budgets, sharing genomic advances, leveraging new technologies, and increasing research on orphan crops.

According to the WRI report, the world probably devotes only around 1.4 to 1.7% of agricultural GDP to agricultural research and development, which is less than the rate of total research spending relative to the total global economy (2.1%).

Richard Gray, agricultural economist at the University of Saskatchewan, gave a talk titled "Successes and Failures in International Wheat Royalty Collection." He said strengthening plant breeders' rights through royalty collection is one way to ensure more stable funding for variety development, an initiative currently underway in Canada through an attempt to create a value creation system via either a trailing royalty or end-point royalty.

But there are challenges. According to Gray, UPOV 1991 Plant Breeders' Rights alone has generally failed to create a viable private wheat breeding industry.

"Producer support is an essential element of increased royalty collection and support has come where producers have some long-term ownership in wheat breeding programs," he said. "Public and producer partnerships have played an important role in providing additional breeding resources while enhancing knowledge sharing."

This first IWC event was a merger of two previously parallel wheat symposia: the International Wheat Genetics Symposium that took place every five years and the International Wheat Conference held every four years. The two groups agreed to join their efforts to create the IWC, said international organizing committee chair Hermann Bürstmayr.

"Wheat is, in terms of acreage, the largest crop on our globe. Wheat is needed for food, feed and materials in countless ways and wheat is a staple food for around two billion people, many of whom live in [developing] countries. Research has to play its role to deliver know-how, improved production tools and improved cultivars to make wheat production sustainable," he said.

"Challenges are plentiful, as they have always been. Certainly, the more erratic weather extremes will be an important issue; cultivars need possibly more resilience and buffering capacity than before. Heat stress is very likely to increase. Also, resource efficiency, particularly nutrient efficiency, such as nitrogen and phosphorus efficiency, will gain more relevance. And wheat production is expanding into non-traditional areas, such as sub-Saharan Africa, which means production systems need to be established for these regions."

Creating a new generation of wheat that is tolerant to heat stress, drought stress, excess moisture and a constantly-evolving army of pests will require ongoing efforts to collaborate globally, which in many ways is already happening. University of Saskatchewan researchers, led by wheat breeder Curtis Pozniak, who helmed the event's Canadian organizing committee, played a key role in mapping the wheat genome as part of an international consortium.

"The bread wheat genome is five times bigger than the human genome; it's a beast. The effort required to undertake cutting-edge research like wheat genome sequencing is massive," said Richard Cuthbert, wheat breeder at the Agriculture and Agri-Food Canada Swift Current Research & Development Centre.

"There are over 110,000 genes in bread wheat. Employing new technologies like gene editing will depend on how we can dissect complex traits to identify the genes that underlie them and how those genes work together. We're standing on the cliff of the next frontier in wheat. Now that we know what the genes are, we need to know how they work and interact with each other."

Also during the event, Genome Canada announced an investment of $11.2 million to go toward some exciting new research spearheaded by Pozniak and fellow wheat researcher Sylvie Cloutier of Agriculture and Agri-Food Canada.

Known as "4D Wheat: Diversity, Domestication, Discovery and Delivery," this research will use wild-wheat relatives and elite germplasm along with industry-leading genomic techniques to better understand wheat's genetic potential. The study will also examine the economics and policies of using wild-wheat germplasm sources and germplasm from international sources.

Pozniak and Cloutier's work will be based out of the Crop Development Centre at the University of Saskatchewan and Agriculture and Agri-Food Canada's Ottawa Research & Development Centre, respectively.

Follow this link:
Can Wheat Save the World? - Seed World

Posted in Genome | Comments Off on Can Wheat Save the World? – Seed World

Posthumanism Theory – Technical Communication Body of …

Posted: at 5:48 am

About Posthumanism Theory

In as brief a definition as possible, humanism is centered on the idea that human needs, values, concerns, and ideals are of the highest importance, or that the human being is the epitome of being. As a development of this idea, posthumanism is based on the notion that humankind can transcend the limitations of the physical human form. In a traditional sense, humans have been considered to be solidly and indisputably classified as high-functioning animals, but animals nonetheless. In this way, the same biological and physical constraints that limit the entire animal kingdom tether humankind to that base level. Posthumanism Theory suggests it is both possible and for the best for humans to attempt to surpass these limitations, often through the use of technology to augment biology (in a way, using the physiological capacity of the human brain to accelerate the functions of the entire human form).

This progressive mentality is an important aspect of the human condition to consider in the course of modern document design and technical rhetoric. Operating under posthumanist ideals requires authors and creators to venture into the hypothetical and the unexplored, because these are the areas that build upon and even improve what we already have established. Posthumanism holds this sentiment at heart: the idea that we, as humans, have no inherent barrier to making our physical and mental functionality much more efficient and powerful than it currently is. To apply these ideals to writing and rhetoric, there is the potential to incorporate the conventions of posthumanism both integrally and progressively. Integrally, a posthuman text should reflect the central ideas of posthumanism: what can authors do to make their texts transcend the perceived limitations of text and writing? How can documents be made to do more than what they currently can do, and how can their readability, usability, and accessibility be expanded? Progressively, a posthuman text should relatably adapt for evolutions in interaction: it might explore such questions as how will human interaction with documents change in the next 10, 20, 50, or 100 years? How can texts encourage mental expansion? What changes in technology can be predicted and accounted for in the delivery and interaction with documents and writing?

Progressions in Usability and Functionality

While the primary focus of posthumanist progression lies in the realm of higher technology, there are developments both in effect and yet to come that have much to do with technical writing and rhetoric. For many, many centuries, writing has been constrained to paper with static text. In more recent decades, the advent of computers and the Internet have caused documents to evolve and adapt. Institution of newer technologies allows for new methods of interactivity, which allow different senses to be utilized by human beings who interact with such documents. Through the use of technology, document designers and writers can allow their readers to interact at a more functional level which is more natural and fully engaging than mere reading.

The qualities of new media enable documents and their interactive elements to tap into the human mind to a higher degree. In that way, technology is being utilized to better the human experience and tap into the full range of human capability.
New developments in technology such as mobile phones, touch screens, e-readers, and other similar technology afford better interactivity and have evolved the way humans interact with their professional and social worlds. Technology is always changing to accommodate more natural, intuitive means of interactivity, but the most posthuman aspect of this technological innovation creep is the ubiquity of technology that allows delivery of writing and documents. Technology has filled in an accessibility gap that now grants access to documents and writing not only on printed paper, but on desktop computers, laptop computers, smartphones, and other such devices. This technology augments human beings' functionality from two directions: it enhances the ability of the audience to read and respond to writing, and it also enhances the ability of the author to create and distribute his or her writing.

Posthumanist rhetoric requires a full understanding of the operation of the human being as an entity, both collectively as an audience and singularly as individual readers. Writing and rhetoric are able to be at their most posthuman when they utilize technology to transcend the physicality of humans as well as the temporality of their existence. In this way, authors begin accommodating more means of delivery and spreading the availability and accessibility of documents, in addition to making documents available at much more timely intervals, even as far as on-demand. Posthuman authors who embrace technological advances gain new dimensions of interactivity both within their text as well as in response to their text. A posthuman rhetoric mindset enables the document to blossom further as a medium as it works in harmony with the qualities of its audience and their humanity.

Read the rest here:

Posthumanism Theory - Technical Communication Body of ...

Posted in Posthumanism | Comments Off on Posthumanism Theory – Technical Communication Body of …

Critical Posthumanism Critical Posthumanism Network

Posted: at 5:48 am

This entry originally appeared in Rosi Braidotti and Maria Hlavajova, eds., Posthuman Glossary (London: Bloomsbury Academic, 2018). Reproduced with permission.

Critical posthumanism is a theoretical approach which maps and engages with the ongoing deconstruction of humanism (cf. Badmington 2000).[1] It differentiates between the figure of the posthuman (and its present, past and projected avatars, like cyborgs, monsters, zombies, ghosts, angels etc.) and posthumanism as the social discourse (in the Foucauldian sense) which negotiates the pressing question of what it means to be human under the conditions of globalisation, technoscience, late capitalism and climate change (often, very problematically, by deliberately blurring the distinctions between science fiction and science fact [cf. science faction in Herbrechter, 2013]).[2]

The prefix post- (in analogy with the discussion of the postmodern and postmodernism following Lyotard [1992])[3] has a double meaning: on the one hand, it signifies a desire or indeed a need to somehow go beyond humanism (or the human), while on the other hand, since the post- also necessarily repeats what it prefixes, it displays an awareness that neither humanism nor the human can in fact be overcome in any straightforward dialectical or historical fashion (for example, in the sense: after the human, the posthuman).

The "critical" in the phrase "critical posthumanism" gestures towards the more complicated and non-dialectical relationships between the human and the posthuman (as well as their respective dependence on the nonhuman). Posthumanism in this critical sense functions more like an anamnesis and a rewriting of the human and humanism (i.e. rewriting humanity, in analogy with Lyotard's notion of "rewriting modernity").[4] Critical posthumanism asks a number of questions that address these complications: how did we come to think of ourselves as human? Or, what exactly does it mean to be human (especially at a time when some humans have apparently decided that they are becoming or have already become posthuman)? What are the motivations for this posthumanising process and when did it start? What are its implications for nonhuman others (e.g. the environment, animals, machines, God, etc.)?

The adjective "critical" in the phrase "critical posthumanism" thus signifies at least two things. It refers to the difference between a more or less uncritical or popular approach (e.g. in many science fiction movies or popular science magazines) and a philosophical and reflective approach that investigates the current postanthropocentric desire. This desire articulates itself, on the one hand, in the form of an anticipated transcendence of the human condition (usually through various scenarios of disembodiment), an approach (and an entire movement) that is best designated by the term "transhumanism", and, on the other hand, through a (rather suspicious) attempt by humans to argue themselves out of the picture precisely at a time when climate change caused by the impact of human civilisation (cf. Anthropocene) calls for urgent and responsible, human action.

The other meaning of "critical" is a defence and possibly a reinvention of some humanist values and methodologies which, in the face of a fundamental transformation provoked by digitalisation and the advent of ubiquitous computing and social media, appear to have become obsolete, or to be in urgent need of revision (esp. critical methodologies which are related to traditional forms of literacy, reading and thinking). The question here is how to remain critical in the sense of developing reading techniques, forms of conceptualisations and subjectivities that are both self-reflexive and aware of their own genealogies (i.e. able to stay critically connected with humanist traditions and esp. literal, literary and textual approaches).

Studies of literature's 21st-century extensions[5] have questioned the broader resonances of the idea that the literary is currently being overtaken by processes of digitalisation, globalisation and technoscientific change. In this current, supposedly "post-literary" moment, a critical posthumanist (and countertextual) approach is both aware and wary of the contemporary desire to leave the humanist apparatus of literacy and its central institution of literature, with all its social, economic and cultural-political implications, its regimes of power and its aesthetics, behind.

To counter the trend of seeing posthumanism merely as the next theory fashion, my Posthumanism: A Critical Analysis[6] takes as its starting point the question as to what extent poststructuralism and deconstruction have anticipated current posthumanist formulations and critiques of subjectivity. This aspect is particularly important with regard to the current discussion about the relevance and future role of the humanities. The first academic publications that systematically engage with the idea of the posthuman and posthumanism appear in the late 1990s and early 2000s (in books and articles by Neil Badmington, Rosi Braidotti, Elaine L. Graham, N. Katherine Hayles, Cary Wolfe and others), all of which approach posthumanism through a more or less poststructuralist or deconstructive lens. They do so, however, by embracing two new aspects: a return to or of the question of technology (as it had been provocatively formulated by Heidegger)[7] and the question of the future of the humanities.

An increasing part of the academy and the (theoretical) humanities in particular have been embracing this new context to form new, interdisciplinary alliances with the sciences and critical science studies (e.g. with Bruno Latour's actor-network-theory, speculative realism, or new feminist materialism). One major aspect concerns the redefinition of the relationship between humans and technology or the role of the history of technics for human (and non-human) evolution. Donna Haraway's early work on the cyborg (in the 1980s) received the widespread discussion it deserved. Attempts to rethink the ontological aspect of technology and the political role of technological determinism, however, also look at previous philosophies of technology (esp. in Heidegger, Ellul and Simondon), most prominently in Bernard Stiegler's work. In the aftermath of the so-called "science wars", which highlighted at once the necessity of cultural recuperations of scientific practice and the call for a new dialogue between the sciences and the humanities, the new or "posthumanities" (cf. the title of Cary Wolfe's influential series with University of Minnesota Press) are set to overcome the traditional "two cultures" divide at last.

This is, however, happening under extremely adverse conditions: the material base of an increasingly globalised advanced and neoliberal capitalism, and the transition from analog (humanist, lettered, book or text-based) to digital (posthumanist, code, data or information-based) societies, cultures and economies. The currently emerging posthumanities therefore have to engage with the positive but also the problematic aspects of the transformative potential that a new dialogue or alliance between the humanities and the sciences contains. The focus on the posthuman as a discursive object, on posthumanism as a social discourse and on posthumanisation as an ongoing historical and ontological process, allows both communities the humanities and sciences to create new encounters and test new hypotheses that may lead to greater political and ethical awareness of the place of the human, the nonhuman and their environments (especially in connection with pressing issues like climate change, depletion of natural resources, the destruction of biodiversity, global migration flows, terrorism and insecurity, biopolitics etc.).

Basically, what is at stake is a rethinking of the relationship between human agency, the role of technology and environmental and cultural factors from a post- or non-anthropocentric perspective.[8] Postanthropocentric posthumanities are still about humans and the humanities, but only in so far as these are placed within a larger, ecological picture (cf. for example the institutionalisation of medical humanities, environmental humanities, and digital humanities). The latter, in particular, will have to address the role of new and converging media and their social and cultural implications, as well as the proliferation of digital and virtual realities and their biopolitical dimensions (e.g. new forms of surveillance and commodification, new subjectivities and biomedia [cf. Thacker, 2004]).[9]

Critical posthumanism thus draws together a number of aspects that constitute our early twenty-first-century reality and cosmology and links these back genealogically to their beginnings and prefigurations within humanism itself (cf. Herbrechter & Callus, 2005 and 2012).

Coventry University, November 2017

[1] Neil Badmington, ed., Posthumanism (Houndmills: Palgrave Macmillan, 2000).

[2] Stefan Herbrechter, Posthumanism: A Critical Analysis (London: Bloomsbury Academic, 2013), pp.107-34.

[3] Jean-François Lyotard, The Postmodern Explained to Children: Correspondence 1982-1985 (Sydney: Power Publications, 1992).

[4] Jean-François Lyotard, The Inhuman: Reflections on Time, trans. Geoffrey Bennington and Rachel Bowlby (Cambridge: Polity Press, 1991).

[5] See for instance the journal, CounterText (Edinburgh: Edinburgh University Press), accessible at: <http://www.euppublishing.com/journal/count>.

[6] Herbrechter, 2013.

[7] Martin Heidegger, The Question Concerning Technology and Other Essays, trans. William Lovitt (New York: Harper and Row, 1977), pp. 283-318.

[8] Rosi Braidotti, The Posthuman (Cambridge: Polity Press, 2013).

[9] Eugene Thacker, Biomedia (Minneapolis: University of Minnesota Press, 2004).

Further Reading

Ellul, Jacques, Le Système technicien (Paris: Le Cherche Midi, 2004 [1977]).

Graham, Elaine P., Representations of the Post/Human: Monsters, Aliens and Others in Popular Culture (Manchester: Manchester University Press, 2002).

Haraway, Donna, When Species Meet (Minneapolis: University of Minnesota Press, 2008).

Haraway, Donna, Simians, Cyborgs, and Women: The Reinvention of Nature (New York: Routledge, 1991).

Hayles, N. Katherine, How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics (Chicago: University of Chicago Press, 1999).

Herbrechter, Stefan and Ivan Callus, eds., Posthumanist Shakespeares (Houndmills: Palgrave Macmillan, 2012).

Herbrechter, Stefan and Ivan Callus, eds., Cy-Borges: Memories of the Posthuman in the Work of Jorge Luis Borges (Lewisburg: Bucknell University Press, 2005).

Latour, Bruno, Reassembling the Social: An Introduction to Actor-Network-Theory (Oxford: Oxford University Press, 2005).

Simondon, Gilbert, Sur la technique (1953-1983) (Paris: Presses Universitaires de France, 2014).

Stiegler, Bernard, Technics and Time, 3 vols. (Stanford: Stanford University Press, 2008-2011).

Wolfe, Cary, What Is Posthumanism? (Minneapolis: University of Minnesota Press, 2010).

Read this article:

Critical Posthumanism Critical Posthumanism Network

Posted in Posthumanism | Comments Off on Critical Posthumanism Critical Posthumanism Network